Compare commits

...

10 Commits

| Author | SHA1 | Message | Date |
|--------|------|---------|------|
| | 7d601ea4c3 | update | 2024-01-15 11:09:20 +00:00 |
| | 6a332a52a4 | add | 2024-01-14 16:05:47 +00:00 |
| | 35c3d5f526 | src | 2024-01-13 23:02:48 +00:00 |
| emdee | 1a4db4ae79 | updates | 2023-12-07 19:51:08 +00:00 |
| emdee | 7a1999f117 | Updated README | 2022-11-29 15:27:16 +00:00 |
| emdee | ec7c600d85 | add exclude_badExits.txt | 2022-11-29 14:52:48 +00:00 |
| emdee | d08b34fd57 | Added notice_log | 2022-11-29 12:54:36 +00:00 |
| emdee | 204a6adc48 | Async added and removed | 2022-11-27 01:10:18 +00:00 |
| emdee | 08626942d3 | changed contact to bad_on | 2022-11-19 10:30:22 +00:00 |
| emdee | aac3793b35 | pep8 isort | 2022-11-17 08:58:45 +00:00 |
20 changed files with 2907 additions and 1102 deletions

.gitignore (vendored, 4 lines changed)

@@ -4,6 +4,10 @@ __pycache__/
*.py[cod]
*$py.class
*~
*.junk
*.dst
# C extensions
*.so

LICENSE.md (new file, 1 line)

@@ -0,0 +1 @@

Makefile (new file, 25 lines)

@@ -0,0 +1,25 @@
PREFIX=/usr/local
LOCAL_DOCTEST=${PREFIX}/bin/toxcore_run_doctest3.bash
DOCTEST=${LOCAL_DOCTEST}
MOD=exclude_badExits
check::
sh python3.sh -c "import ${MOD}"
lint::
sh .pylint.sh
install::
pip3.sh install --target ${PREFIX}/lib/python3.11/site-packages/ --upgrade .
rsync::
bash .rsync.sh
test:: doctest
doctest::
export PYTHONPATH=${PWD}/src/${MOD}
${DOCTEST} ${MOD}.txt
clean::
find * -name \*~ -delete

README.md (208 lines changed)

@@ -1,149 +1,141 @@
This extends nusenu's basic idea of using the stem library to
dynamically exclude nodes that are likely to be bad by putting them
on the ExcludeNodes or ExcludeExitNodes setting of a running Tor.
* https://github.com/nusenu/noContactInfo_Exit_Excluder
* https://github.com/TheSmashy/TorExitRelayExclude
The basic idea is to exclude Exit nodes that do not have ContactInfo:
* https://github.com/nusenu/ContactInfo-Information-Sharing-Specification
That can be extended to relays that do not have an email in the contact,
or to relays that do not have ContactInfo that is verified to include them.
But there's a problem, and your Tor notice.log will tell you about it:
you could exclude the relays needed to access hidden services or mirror
directories. So we need to add to the process the concept of a whitelist.
In addition, we may have our own blacklist of nodes we want to exclude,
or use these lists for other applications like selektor.
So we make two files that are structured in YAML:
```
/etc/tor/yaml/torrc-goodnodes.yaml
---
GoodNodes:
  EntryNodes: []
  # ExitNodes will be overwritten by this program
  ExitNodes: []
  IntroductionPoints: []
  # use the Onions section to list onion services you want the
  # Introduction Points whitelisted - these points may change daily
  # Look in tor's notice.log for 'Every introduction point for service'
  Onions: []
  # use the Services list to list relays you want whitelisted
  # Look in tor's notice.log for 'Wanted to contact directory mirror'
  Services: []
```
By default all sections of the goodnodes.yaml are used as a whitelist.
Use the GoodNodes/Onions list to list onion services you want the
Introduction Points whitelisted - these points may change daily.
Look in tor's notice.log for warnings of 'Every introduction point for service'.
```--hs_dir``` (```default='/var/lib/tor'```) will make the program
parse the files named ```hostname``` below this dir to find
Hidden Services to whitelist.
The Introduction Points can change during the day, so you may want to
rerun this program to freshen the list of Introduction Points. A full run
that processes all the relays from stem can take 30 minutes; running with
```--saved_only``` will use just the cached information
on the relays, but will update the Introduction Points from the Services.
```
/etc/tor/yaml/torrc-badnodes.yaml
---
BadNodes:
  # list the internet domains you know are bad so you don't
  # waste time trying to download contacts from them.
  ExcludeDomains: []
  ExcludeNodes:
    # BadExit will be overwritten by this program
    BadExit: []
    # list MyBadExit in --bad_sections if you want it used, to exclude nodes
    # or any others as a list separated by comma(,)
    MyBadExit: []
```
That part requires [PyYAML](https://pyyaml.org/wiki/PyYAML)
https://github.com/yaml/pyyaml/ or ```ruamel```: do
```pip3 install ruamel``` or ```pip3 install PyYAML```;
the advantage of the former is that it preserves comments.
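The choice between the two libraries is typically a fallback import at startup; roughly (a sketch, not the program's exact code):

```python
def fget_yaml_loader():
    """Prefer ruamel (round-trip mode keeps comments), fall back to PyYAML."""
    try:
        from ruamel.yaml import YAML
        y = YAML(typ='rt')          # 'rt' = round-trip: comments survive a dump
        y.indent(mapping=2, sequence=2)
        return y.load
    except ImportError:
        pass
    try:
        import yaml
        return yaml.safe_load       # works, but comments are lost on re-write
    except ImportError:
        return None                 # caller must handle a missing YAML library
```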
Right now only the ExcludeExitNodes section is used, but we may add ExcludeNodes
later; by default all sub-sections of the badnodes.yaml are used as
ExcludeExitNodes, but this can be customized with the lWanted commandline arg.
The original idea has also been extended to add different conditions for
exclusion: the ```--contact``` commandline arg is a comma sep list of conditions:
* Empty - no contact info
* NoEmail - no @ sign in the contact
More may be added later.
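These conditions amount to a small classifier over the raw ContactInfo string; sketched here with a hypothetical helper mirroring the documented semantics:

```python
def lconditions(contact):
    """Return the documented exclusion condition a ContactInfo string meets."""
    if not contact or not contact.strip():
        return ['Empty']        # no contact info at all
    if '@' not in contact:
        return ['NoEmail']      # no @ sign anywhere in the contact
    return []
```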
(You may have to run this as the Tor user to get RW access to
/run/tor/control, in which case the directory for the YAML files must
be group Tor writeable, and its parent's directories group Tor RX.)
Because you don't want to exclude the introduction points to any onion
you want to connect to, ```--white_onions``` should whitelist the
introduction points to a comma sep list of onions; we fixed stem to do this:
* https://github.com/torproject/stem/issues/96
* https://gitlab.torproject.org/legacy/trac/-/issues/25417
Use the GoodNodes/Onions list in goodnodes.yaml to list onion services
you want the Introduction Points whitelisted - these points may change daily.
Look in tor's notice.log for 'Every introduction point for service'
```--notice_log``` will parse the notice log for warnings about relays and
services that will then be whitelisted.
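A sketch of that notice-log scan; the two message texts are the ones quoted above, while the surrounding regexes and helper name are assumptions:

```python
import re

def aparse_notice_log(lines):
    """Pull onion services and directory-mirror fingerprints from a notice log."""
    found = {'onions': [], 'relays': []}
    for line in lines:
        m = re.search(r"Every introduction point for service (\S+)", line)
        if m:
            found['onions'].append(m.group(1))
        m = re.search(r"Wanted to contact directory mirror \$?([0-9A-F]{40})", line)
        if m:
            found['relays'].append(m.group(1))
    return found
```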
```--torrc_output``` will write the torrc ExcludeNodes configuration to a file.
```--good_contacts``` will write the contact info as a ciiss dictionary
to a YAML file. If the proof is uri-rsa, the well-known file of fingerprints
is downloaded and the fingerprints are added on a 'fps' field we create
of that fingerprint's entry of the YAML dictionary. This file is read at the
beginning of the program to start with a trust database, and only new
contact info from new relays are added to the dictionary.
You can expect it to take an hour or two the first time this is run:
>700 domains.
Now for the final part: we lookup the Contact info of every relay
that is currently in our Tor, and check for the existence of the
well-known file that lists the fingerprints of the relays it runs.
If it fails to provide the well-known url, we assume it's a bad
relay and add it to a list of nodes that goes on ```ExcludeNodes```
(not just ```ExcludeExitNodes```). If the Contact info is good, we add the
list of fingerprints to ```ExitNodes```, a whitelist of relays to use as exits.
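The end product is just two torrc settings; written out, the shape is roughly the following (standard torrc fingerprint syntax; the helper name is hypothetical):

```python
def storrc_excludes(lbad_fps, lgood_fps):
    """Render ExcludeNodes/ExitNodes torrc lines from fingerprint lists."""
    lines = []
    if lbad_fps:
        lines.append('ExcludeNodes ' + ','.join('$' + fp for fp in lbad_fps))
    if lgood_fps:
        lines.append('ExitNodes ' + ','.join('$' + fp for fp in lgood_fps))
    return '\n'.join(lines)
```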
```--bad_on``` We offer the users 3 levels of cleaning:
1. clean relays that have no contact ```=Empty```
2. clean relays that don't have an email in the contact (implies 1)
```=Empty,NoEmail```
3. clean relays that don't have "good" contactinfo (implies 1)
```=Empty,NoEmail,NotGood```
The default is ```Empty,NoEmail,NotGood```; ```NoEmail``` is inherently imperfect
because many contact email addresses are obfuscated, but we try anyway.
To be "good" the ContactInfo must:
1. have a url for the well-known file to be gotten
2. must have a file that can be gotten at the URL
3. must support getting the file with a valid SSL cert from a recognized authority
4. (not in the spec but added by Python) must use TLS > v1
5. must have a fingerprint list in the file
6. must have the FP that got us the contactinfo in the fingerprint list in the file.
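Criteria 5 and 6 reduce to parsing the fetched ```rsa-fingerprint.txt``` body and checking membership; a sketch (the 40-character, non-comment filter matches what the program does; the strict hex check is an extra assumption here):

```python
import re

def lparse_wellknown(data):
    """Fingerprints in a rsa-fingerprint.txt body: 40 hex chars, no comments."""
    out = []
    for line in data.upper().strip().split('\n'):
        line = line.strip()
        if line and not line.startswith('#') and re.fullmatch(r'[0-9A-F]{40}', line):
            out.append(line)
    return out

def bfp_proven(data, fp):
    """Criterion 6: the relay's own fingerprint must appear in the file."""
    return fp.upper() in lparse_wellknown(data)
```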
```--wait_boot``` is the number of seconds to wait for Tor to bootstrap
```--wellknown_output``` will make the program write the well-known files
(```/.well-known/tor-relay/rsa-fingerprint.txt```) to a directory.
```--torrc_output``` will write a file of the commands that it sends to
the Tor controller, so you can include it in a ```/etc/tor/torrc```.
```--relays_output``` will write the downloaded relays in json to a file. The relays
are downloaded from https://onionoo.torproject.org/details
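For reference, a sketch of pulling that details document (plain urllib here for illustration; the program itself fetches through the Tor SOCKS proxy, and the field list is illustrative):

```python
import json
import urllib.request

ONIONOO = 'https://onionoo.torproject.org/details'

def surl_details(lfields=('fingerprint', 'contact')):
    """Build a details URL restricted to the given fields."""
    return ONIONOO + '?fields=' + ','.join(lfields)

def lrelays(sjson):
    """Extract the relay list from a details document."""
    return json.loads(sjson).get('relays', [])

def lfetch_relays():
    """Download and decode the relay details (network access required)."""
    with urllib.request.urlopen(surl_details(), timeout=60) as resp:
        return lrelays(resp.read().decode('utf-8'))
```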
For usage, do ```python3 exclude_badExits.py --help```
See [exclude_badExits.txt](./exclude_badExits.txt)
## Usage
```
usage: exclude_badExits.py [-h] [--https_cafile HTTPS_CAFILE]
[--proxy_host PROXY_HOST] [--proxy_port PROXY_PORT]
[--proxy_ctl PROXY_CTL] [--torrc TORRC]
[--timeout TIMEOUT] [--good_nodes GOOD_NODES]
[--bad_nodes BAD_NODES] [--contact CONTACT]
[--bad_contacts BAD_CONTACTS]
[--strict_nodes {0,1}] [--wait_boot WAIT_BOOT]
[--points_timeout POINTS_TIMEOUT]
[--log_level LOG_LEVEL]
[--bad_sections BAD_SECTIONS]
[--white_services WHITE_SERVICES]
[--torrc_output TORRC_OUTPUT]
[--proof_output PROOF_OUTPUT]
```
### Optional arguments:
```
-h, --help show this help message and exit
--https_cafile HTTPS_CAFILE
Certificate Authority file (in PEM)
```
```
--proxy_host PROXY_HOST, --proxy-host PROXY_HOST
proxy host
--proxy_port PROXY_PORT, --proxy-port PROXY_PORT
proxy control port
--proxy_ctl PROXY_CTL, --proxy-ctl PROXY_CTL
control socket - or port
```
```
--torrc TORRC torrc to check for suggestions
--timeout TIMEOUT proxy download connect timeout
```
```
--good_nodes GOOD_NODES
Yaml file of good info that should not be excluded
--bad_nodes BAD_NODES
Yaml file of bad nodes that should also be excluded
```
```
--contact CONTACT comma sep list of conditions - Empty,NoEmail
--bad_contacts BAD_CONTACTS
Yaml file of bad contacts that bad FPs are using
```
```
--strict_nodes {0,1} Set StrictNodes: 1 is less anonymous but more secure,
although some sites may be unreachable
--wait_boot WAIT_BOOT
Seconds to wait for Tor to bootstrap
--points_timeout POINTS_TIMEOUT
Timeout for getting introduction points - must be long
>120sec. 0 means disabled looking for IPs
```
```
--log_level LOG_LEVEL
10=debug 20=info 30=warn 40=error
--bad_sections BAD_SECTIONS
sections of the badnodes.yaml to use, comma separated,
'' BROKEN
```
```
--white_services WHITE_SERVICES
comma sep. list of onions to whitelist their
introduction points - BROKEN
```
```
--torrc_output TORRC_OUTPUT
Write the torrc configuration to a file
--proof_output PROOF_OUTPUT
Write the proof data of the included nodes to a YAML
file
```


@@ -0,0 +1,23 @@
#!/bin/sh
PROG=exclude_badExits
build=build
dist=dist
# pyinstaller
if [ ! -e ${dist}/${PROG}.pyi -o ! ${dist}/${PROG}.pyi -nt ./${PROG}.py ] ; then
[ -f ${PROG}.spec ] || pyi-makespec ./${PROG}.py -F -c
[ -d ${build} ] || mkdir -p ${build}
[ -d ${dist} ] || mkdir -p ${dist}
[ -e ${dist}/${PROG}.pyi -a ${dist}/${PROG}.pyi -nt ./${PROG}.py ] || \
pyinstaller --distpath ${dist} --workpath ${build} \
--exclude tkinter --exclude matplotlib \
--exclude twisted --exclude jedi --exclude jaraco \
--exclude sphinx --exclude coverage --exclude nose \
--exclude PIL --exclude numpy --exclude OpenGL \
--exclude PySide2 --exclude PyQt5 --exclude IPython \
--onefile -c --ascii \
$PROG.py
# AttributeError: 'NoneType' object has no attribute 'groups'
# utils.py #400
fi
# cx_Freeze exclude_badExits.py

exclude_badExits-pki.bash (new file, 24 lines)

@@ -0,0 +1,24 @@
#!/bin/sh
# -*- mode: sh; fill-column: 75; tab-width: 8; coding: utf-8-unix -*-
PROG=exclude_badExits
build=build
dist=dist
# pyinstaller
if [ ! -e ${dist}/${PROG}.pyi -o ! ${dist}/${PROG}.pyi -nt ./${PROG}.py ] ; then
[ -f ${PROG}.spec ] || pyi-makespec ./${PROG}.py -F -c
[ -d ${build} ] || mkdir -p ${build}
[ -d ${dist} ] || mkdir -p ${dist}
[ -e ${dist}/${PROG}.pyi -a ${dist}/${PROG}.pyi -nt ./${PROG}.py ] || \
pyinstaller --distpath ${dist} --workpath ${build} \
--exclude tkinter --exclude matplotlib \
--exclude twisted --exclude jedi --exclude jaraco \
--exclude sphinx --exclude coverage --exclude nose \
--exclude PIL --exclude numpy --exclude OpenGL \
--exclude PySide2 --exclude PyQt5 --exclude IPython \
--onefile -c --ascii \
$PROG.py
# AttributeError: 'NoneType' object has no attribute 'groups'
# utils.py #400
fi
# cx_Freeze exclude_badExits.py


@@ -3,23 +3,38 @@
PROG=exclude_badExits.py
SOCKS_PORT=9050
SOCKS_HOST=127.0.0.1
CAFILE=/etc/ssl/certs/ca-certificates.crt
# you may have a special python for installed packages
EXE=`which python3.bash`
[ -f exclude_badExits.hlp ] || \
$EXE exclude_badExits.py --help > exclude_badExits.hlp
[ -f README.md ] || \
$EXE -c 'from exclude_badExits import __doc__; print(__doc__)' > README.md
# an example of running exclude_badExits with full debugging
# expected to take 20 minutes or so
declare -a LARGS
LARGS=(
# --saved_only
# --strict_nodes 1
--points_timeout 150
--log_level 10
--https_cafile $CAFILE
)
[ -z "$socks_proxy" ] || \
LARGS+=(
--proxy-host $SOCKS_HOST
--proxy-port $SOCKS_PORT
)
if [ -f /var/lib/tor/.SelekTOR/3xx/cache/9050/notice.log ] ; then
LARGS+=(--notice_log /var/lib/tor/.SelekTOR/3xx/cache/9050/notice.log)
fi
if [ -d /var/lib/tor/hs ] ; then
LARGS+=( --hs_dir /var/lib/tor/hs )
fi
if [ -f '/run/tor/control' ] ; then
LARGS+=(--proxy-ctl '/run/tor/control' )
@@ -34,8 +49,9 @@ LARGS+=( --white_onions $ddg )
# you may need to be the tor user to read /run/tor/control
grep -q ^debian-tor /etc/group && TORU=debian-tor || {
grep -q ^tor /etc/group && TORU=tor
}
# --saved_only
sudo -u $TORU $EXE src/exclude_badExits/exclude_badExits.py "${LARGS[@]}" "$@" \
2>&1|tee exclude_badExits6.log
# The DEBUG statements contain the detail of why the relay was considered bad.

exclude_badExits.hlp (new file, 76 lines)

@@ -0,0 +1,76 @@
usage: exclude_badExits.py [-h] [--https_cafile HTTPS_CAFILE]
[--proxy_host PROXY_HOST] [--proxy_port PROXY_PORT]
[--proxy_ctl PROXY_CTL] [--torrc TORRC]
[--timeout TIMEOUT] [--good_nodes GOOD_NODES]
[--bad_nodes BAD_NODES] [--bad_on BAD_ON]
[--bad_contacts BAD_CONTACTS] [--saved_only]
[--strict_nodes {0,1}] [--wait_boot WAIT_BOOT]
[--points_timeout POINTS_TIMEOUT]
[--log_level LOG_LEVEL]
[--bad_sections BAD_SECTIONS]
[--white_onions WHITE_ONIONS]
[--torrc_output TORRC_OUTPUT] [--hs_dir HS_DIR]
[--notice_log NOTICE_LOG]
[--relays_output RELAYS_OUTPUT]
[--wellknown_output WELLKNOWN_OUTPUT]
[--good_contacts GOOD_CONTACTS]
optional arguments:
-h, --help show this help message and exit
--https_cafile HTTPS_CAFILE
Certificate Authority file (in PEM)
--proxy_host PROXY_HOST, --proxy-host PROXY_HOST
proxy host
--proxy_port PROXY_PORT, --proxy-port PROXY_PORT
proxy socks port
--proxy_ctl PROXY_CTL, --proxy-ctl PROXY_CTL
control socket - or port
--torrc TORRC torrc to check for suggestions
--timeout TIMEOUT proxy download connect timeout
--good_nodes GOOD_NODES
Yaml file of good info that should not be excluded
--bad_nodes BAD_NODES
Yaml file of bad nodes that should also be excluded
--bad_on BAD_ON comma sep list of conditions - Empty,NoEmail,NotGood
--bad_contacts BAD_CONTACTS
Yaml file of bad contacts that bad FPs are using
--saved_only Just use the info in the last *.yaml files without
querying the Tor controller
--strict_nodes {0,1} Set StrictNodes: 1 is less anonymous but more secure,
although some onion sites may be unreachable
--wait_boot WAIT_BOOT
Seconds to wait for Tor to bootstrap
--points_timeout POINTS_TIMEOUT
Timeout for getting introduction points - must be long
>120sec. 0 means disabled looking for IPs
--log_level LOG_LEVEL
10=debug 20=info 30=warn 40=error
--bad_sections BAD_SECTIONS
sections of the badnodes.yaml to use, in addition to
BadExit, comma separated
--white_onions WHITE_ONIONS
comma sep. list of onions to whitelist their
introduction points - BROKEN
--torrc_output TORRC_OUTPUT
Write the torrc configuration to a file
--hs_dir HS_DIR Parse the files named hostname below this dir to find
Hidden Services to whitelist
--notice_log NOTICE_LOG
Parse the notice log for relays and services
--relays_output RELAYS_OUTPUT
Write the downloaded relays in json to a file
--wellknown_output WELLKNOWN_OUTPUT
Write the well-known files to a directory
--good_contacts GOOD_CONTACTS
Write the proof data of the included nodes to a YAML
file
This extends nusenu's basic idea of using the stem library to dynamically
exclude nodes that are likely to be bad by putting them on the ExcludeNodes or
ExcludeExitNodes setting of a running Tor. *
https://github.com/nusenu/noContactInfo_Exit_Excluder *
https://github.com/TheSmashy/TorExitRelayExclude The basic idea is to exclude
Exit nodes that do not have ContactInfo: *
https://github.com/nusenu/ContactInfo-Information-Sharing-Specification That
can be extended to relays that do not have an email in the contact, or to
relays that do not have ContactInfo that is verified to include them.

exclude_badExits.md (new file, 1 line)

@@ -0,0 +1 @@
None


@@ -1,833 +0,0 @@
# -*- mode: python; indent-tabs-mode: nil; py-indent-offset: 4; coding: utf-8 -*-
# https://github.com/nusenu/noContactInfo_Exit_Excluder
# https://github.com/TheSmashy/TorExitRelayExclude
"""
This extends nusenu's basic idea of using the stem library to
dynamically exclude nodes that are likely to be bad by putting them
on the ExcludeNodes or ExcludeExitNodes setting of a running Tor.
* https://github.com/nusenu/noContactInfo_Exit_Excluder
* https://github.com/TheSmashy/TorExitRelayExclude
The basic cut is to exclude Exit nodes that do not have a contact.
That can be extended to nodes that do not have an email in the contact etc.
"""
"""But there's a problem, and your Tor notice.log will tell you about it:
you could exclude the nodes needed to access hidden services or
directories. So we need to add to the process the concept of a whitelist.
In addition, we may have our own blacklist of nodes we want to exclude,
or use these lists for other applications like selektor.
So we make two files that are structured in YAML:
```
/etc/tor/yaml/torrc-goodnodes.yaml
GoodNodes:
Relays:
IntroductionPoints:
- NODEFINGERPRINT
...
By default all sections of the goodnodes.yaml are used as a whitelist.
/etc/tor/yaml/torrc-badnodes.yaml
BadNodes:
ExcludeExitNodes:
BadExit:
# $0000000000000000000000000000000000000007
```
That part requires [PyYAML](https://pyyaml.org/wiki/PyYAML)
https://github.com/yaml/pyyaml/
Right now only the ExcludeExitNodes section is used, but we may add ExcludeNodes
later, and by default all sub-sections of the badnodes.yaml are used as a
ExcludeExitNodes but it can be customized with the lWanted commandline arg.
The original idea has also been extended to add different conditions for
exclusion: the ```--contact``` commandline arg is a comma sep list of conditions:
* Empty - no contact info
* NoEmail - no @ sign in the contact
More may be added later.
Because you don't want to exclude the introduction points to any onion
you want to connect to, ```--white_onions``` should whitelist the
introduction points to a comma sep list of onions, but is
currently broken in stem 1.8.0: see:
* https://github.com/torproject/stem/issues/96
* https://gitlab.torproject.org/legacy/trac/-/issues/25417
```--torrc_output``` will write the torrc ExcludeNodes configuration to a file.
Now for the final part: we lookup the Contact info of every server
that is currently in our Tor, and check it for its existence.
If it fails to provide the well-known url, we assume it's a bogus
relay and add it to a list of nodes that goes on ExcludeNodes -
not just exclude Exit.
If the Contact info is good we add the list of fingerprints to add
to ExitNodes, a whitelist of relays to use as exits.
```--good_contacts``` will write the contact info as a ciiss dictionary
to a YAML file. If the proof is uri-rsa, the well-known file of fingerprints
is downloaded and the fingerprints are added on a 'fps' field we create
of that fingerprint's entry of the YAML dictionary. This file is read at the
beginning of the program to start with a trust database, and only new
contact info from new relays are added to the dictionary.
You can expect it to take an hour or two the first time this is run:
>700 domains.
For usage, do ```python3 exclude_badExits.py --help```
"""
import sys
import os
import re
import socket
import time
import argparse
import string
from io import StringIO
import ipaddress
# list(ipaddress._find_address_range(ipaddress.IPv4Network('172.16.0.0/12'))
from urllib3.util.ssl_match_hostname import CertificateError
import stem
from stem import InvalidRequest
from stem.control import Controller
from stem.connection import IncorrectPassword
from stem.util.tor_tools import is_valid_fingerprint
try:
from ruamel.yaml import YAML
yaml = YAML(typ='rt')
yaml.indent(mapping=2, sequence=2)
safe_load = yaml.load
except:
yaml = None
if yaml is None:
try:
import yaml
safe_load = yaml.safe_load
except:
yaml = None
try:
from unbound import ub_ctx,RR_TYPE_TXT,RR_CLASS_IN
except:
ub_ctx = RR_TYPE_TXT = RR_CLASS_IN = None
global LOG
import logging
import warnings
warnings.filterwarnings('ignore')
LOG = logging.getLogger()
from support_phantompy import vsetup_logging
from trustor_poc import oDownloadUrlUrllib3 as oDownloadUrl
from trustor_poc import idns_validate, TrustorError
from support_onions import icheck_torrc, bAreWeConnected, lIntroductionPoints, zResolveDomain, vwait_for_controller, yKNOWN_NODNS
LOG.info("imported HTTPSAdapter")
ETC_DIR = '/etc/tor/yaml'
aTRUST_DB = {}
aTRUST_DB_INDEX = {}
aFP_EMAIL = {}
sDETAILS_URL = "https://metrics.torproject.org/rs.html#details/"
# You can call this while bootstrapping
sEXCLUDE_EXIT_KEY = 'ExcludeNodes'
sINCLUDE_EXIT_KEY = 'ExitNodes'
sINCLUDE_GUARD_KEY = 'EntryNodes'
def oMakeController(sSock='', port=9051):
import getpass
if sSock and os.path.exists(sSock):
controller = Controller.from_socket_file(path=sSock)
else:
controller = Controller.from_port(port=port)
sys.stdout.flush()
p = getpass.unix_getpass(prompt='Controller Password: ', stream=sys.stderr)
controller.authenticate(p)
return controller
oBAD_NODES = {}
oBAD_ROOT = 'BadNodes'
oBAD_NODES[oBAD_ROOT] = {}
oBAD_NODES[oBAD_ROOT]['ExcludeNodes'] = {}
lKNOWN_NODNS = []
lMAYBE_NODNS = []
def lYamlBadNodes(sFile,
section=sEXCLUDE_EXIT_KEY,
lWanted=['BadExit']):
global oBAD_NODES
global lKNOWN_NODNS
global lMAYBE_NODNS
l = []
if not yaml: return l
if os.path.exists(sFile):
with open(sFile, 'rt') as oFd:
oBAD_NODES = safe_load(oFd)
# BROKEN
# root = 'ExcludeNodes'
# for elt in o[oBAD_ROOT][root][section].keys():
# if lWanted and elt not in lWanted: continue
# # l += o[oBAD_ROOT][root][section][elt]
l = oBAD_NODES[oBAD_ROOT]['ExcludeNodes']['BadExit']
root = 'ExcludeDomains'
if root not in oBAD_NODES[oBAD_ROOT] or not oBAD_NODES[oBAD_ROOT][root]:
lMAYBE_NODNS = safe_load(StringIO(yKNOWN_NODNS))
else:
lMAYBE_NODNS = oBAD_NODES[oBAD_ROOT][root]
return l
oGOOD_NODES = {}
oGOOD_ROOT = 'GoodNodes'
def lYamlGoodNodes(sFile='/etc/tor/torrc-goodnodes.yaml'):
global oGOOD_NODES
root = oGOOD_ROOT
l = []
if not yaml: return l
if os.path.exists(sFile):
with open(sFile, 'rt') as oFd:
o = safe_load(oFd)
oGOOD_NODES = o
if 'GuardNodes' in o[oGOOD_ROOT].keys():
l = o[oGOOD_ROOT]['GuardNodes']
# yq '.Nodes.IntroductionPoints|.[]' < /etc/tor/torrc-goodnodes.yaml
return l
def bdomain_is_bad(domain, fp):
global lKNOWN_NODNS
if domain in lKNOWN_NODNS: return True
if domain in lMAYBE_NODNS:
ip = zResolveDomain(domain)
if ip == '':
LOG.debug(f"{fp} {domain} does not resolve")
lKNOWN_NODNS.append(domain)
lMAYBE_NODNS.remove(domain)
return True
for elt in '@(){}$!':
if elt in domain:
LOG.warn(f"{elt} in domain {domain}")
return True
return False
tBAD_URLS = set()
lATS = ['abuse', 'email']
lINTS = ['ciissversion', 'uplinkbw', 'signingkeylifetime', 'memory']
lBOOLS = ['dnssec', 'dnsqname', 'aesni', 'autoupdate', 'dnslocalrootzone',
'sandbox', 'offlinemasterkey']
def aVerifyContact(a, fp, https_cafile, timeout=20, host='127.0.0.1', port=9050):
global tBAD_URLS
global lKNOWN_NODNS
# cleanups for yaml
for elt in lINTS:
if elt in a:
a[elt] = int(a[elt])
for elt in lBOOLS:
if elt in a:
if a[elt] in ['y','yes', 'true', 'True']:
a[elt] = True
else:
a[elt] = False
for elt in lATS:
if elt in a:
a[elt] = a[elt].replace('[]', '@')
a.update({'fps': []})
keys = list(a.keys())
if 'email' not in keys:
LOG.warn(f"{fp} 'email' not in {keys}")
a['email'] = ''
if 'ciissversion' not in keys:
aFP_EMAIL[fp] = a['email']
LOG.warn(f"{fp} 'ciissversion' not in {keys}")
a['ciissversion'] = 2
# test the url for fps and add it to the array
if 'proof' not in keys:
aFP_EMAIL[fp] = a['email']
LOG.warn(f"{fp} 'proof' not in {keys}")
return a
if aTRUST_DB_INDEX and fp in aTRUST_DB_INDEX.keys():
aCachedContact = aTRUST_DB_INDEX[fp]
if aCachedContact['email'] == a['email']:
LOG.info(f"{fp} in aTRUST_DB_INDEX")
return aCachedContact
if 'url' not in keys:
if 'uri' not in keys:
a['url'] = ''
aFP_EMAIL[fp] = a['email']
LOG.warn(f"{fp} url and uri not in {keys}")
return a
a['url'] = a['uri']
aFP_EMAIL[fp] = a['email']
LOG.debug(f"{fp} 'uri' but not 'url' in {keys}")
# drop through
c = re.sub(r'^https?://', '', a['url']).strip('/')  # lstrip() strips a char set, not a prefix
a['url'] = 'https://' +c
# domain should be a unique key for contacts
domain = a['url'][8:]
if bdomain_is_bad(domain, fp):
LOG.warn(f"{domain} is bad from {a['url']}")
LOG.debug(f"{fp} is bad from {a}")
return a
ip = zResolveDomain(domain)
if ip == '':
aFP_EMAIL[fp] = a['email']
LOG.debug(f"{fp} {domain} does not resolve")
lKNOWN_NODNS.append(domain)
return {}
if a['proof'] not in ['uri-rsa']:
# only support uri for now
if False and ub_ctx:
fp_domain = fp +'.'+domain
if idns_validate(fp_domain,
libunbound_resolv_file='resolv.conf',
dnssec_DS_file='dnssec-root-trust',
) == 0:
pass
LOG.warn(f"{fp} proof={a['proof']} not supported yet")
return a
LOG.debug(f"{len(keys)} contact fields for {fp}")
url="https://"+domain+"/.well-known/tor-relay/rsa-fingerprint.txt"
try:
LOG.debug(f"Downloading from {domain} for {fp}")
o = oDownloadUrl(url, https_cafile,
timeout=timeout, host=host, port=port)
# requests response: text "reason", "status_code"
except AttributeError as e:
LOG.exception(f"AttributeError downloading from {domain} {e}")
except CertificateError as e:
LOG.warn(f"CertificateError downloading from {domain} {e}")
tBAD_URLS.add(a['url'])
except TrustorError as e:
if e.args == "HTTP Errorcode 404":
aFP_EMAIL[fp] = a['email']
LOG.warn(f"TrustorError 404 from {domain} {e.args}")
else:
LOG.warn(f"TrustorError downloading from {domain} {e.args}")
tBAD_URLS.add(a['url'])
except (BaseException ) as e:
LOG.error(f"Exception {type(e)} downloading from {domain} {e}")
else:
if hasattr(o, 'status'):
status_code = o.status
else:
status_code = o.status_code
if status_code >= 300:
aFP_EMAIL[fp] = a['email']
LOG.warn(f"Error from {domain} {status_code} {o.reason}")
# any reason retry?
tBAD_URLS.add(a['url'])
return a
if hasattr(o, 'text'):
data = o.text
else:
data = str(o.data, 'UTF-8')
l = data.upper().strip().split('\n')
LOG.debug(f"Downloaded from {domain} {len(l)} lines {len(data)} bytes")
a['modified'] = int(time.time())
if not l:
LOG.warn(f"Downloading from {domain} empty for {fp}")
else:
a['fps'] = [elt for elt in l if elt and len(elt) == 40
and not elt.startswith('#')]
LOG.info(f"Downloaded from {domain} {len(a['fps'])} FPs")
return a
def aParseContactYaml(contact, fp):
"""
See the Tor ContactInfo Information Sharing Specification v2
https://nusenu.github.io/ContactInfo-Information-Sharing-Specification/
"""
lelts = contact.split()
a = {}
if len(lelts) % 2 != 0:
LOG.warn(f"bad contact for {fp} odd number of components")
LOG.debug(f"{fp} {a}")
return a
key = ''
for elt in lelts:
if key == '':
key = elt
continue
a[key] = elt
key = ''
LOG.debug(f"{fp} {len(a.keys())} fields")
return a
def aParseContact(contact, fp):
"""
See the Tor ContactInfo Information Sharing Specification v2
https://nusenu.github.io/ContactInfo-Information-Sharing-Specification/
"""
l = [line for line in contact.strip().replace('"', '').split(' ')
if ':' in line]
LOG.debug(f"{fp} {len(l)} fields")
s = f'"{fp}":\n'
s += '\n'.join([f" {line}\"".replace(':',': \"', 1)
for line in l])
oFd = StringIO(s)
a = safe_load(oFd)
return a
def oMainArgparser(_=None):
try:
from OpenSSL import SSL
lCAfs = SSL._CERTIFICATE_FILE_LOCATIONS
except:
lCAfs = []
CAfs = []
for elt in lCAfs:
if os.path.exists(elt):
CAfs.append(elt)
if not CAfs:
CAfs = ['']
parser = argparse.ArgumentParser(add_help=True,
epilog=__doc__)
parser.add_argument('--https_cafile', type=str,
help="Certificate Authority file (in PEM)",
default=CAfs[0])
parser.add_argument('--proxy_host', '--proxy-host', type=str,
default='127.0.0.1',
help='proxy host')
parser.add_argument('--proxy_port', '--proxy-port', default=9050, type=int,
help='proxy control port')
parser.add_argument('--proxy_ctl', '--proxy-ctl',
default='/run/tor/control' if os.path.exists('/run/tor/control') else 9051,
type=str,
help='control socket - or port')
parser.add_argument('--torrc',
default='/etc/tor/torrc-defaults',
type=str,
help='torrc to check for suggestions')
parser.add_argument('--timeout', default=60, type=int,
help='proxy download connect timeout')
parser.add_argument('--good_nodes', type=str,
default=os.path.join(ETC_DIR, 'goodnodes.yaml'),
help="Yaml file of good info that should not be excluded")
parser.add_argument('--bad_nodes', type=str,
default=os.path.join(ETC_DIR, 'badnodes.yaml'),
help="Yaml file of bad nodes that should also be excluded")
parser.add_argument('--contact', type=str, default='Empty,NoEmail',
help="comma sep list of conditions - Empty,NoEmail")
parser.add_argument('--bad_contacts', type=str,
default=os.path.join(ETC_DIR, 'badcontacts.yaml'),
help="Yaml file of bad contacts that bad FPs are using")
parser.add_argument('--strict_nodes', type=int, default=0,
choices=[0,1],
help="Set StrictNodes: 1 is less anonymous but more secure, although some sites may be unreachable")
parser.add_argument('--wait_boot', type=int, default=120,
help="Seconds to wait for Tor to bootstrap")
parser.add_argument('--points_timeout', type=int, default=0,
help="Timeout for getting introduction points - must be long >120sec. 0 means disabled looking for IPs")
parser.add_argument('--log_level', type=int, default=20,
help="10=debug 20=info 30=warn 40=error")
parser.add_argument('--bad_sections', type=str,
default='MyBadExit',
help="sections of the badnodes.yaml to use, comma separated, '' BROKEN")
parser.add_argument('--white_onions', type=str,
default='',
help="comma sep. list of onions to whitelist their introduction points - BROKEN")
parser.add_argument('--torrc_output', type=str,
default=os.path.join(ETC_DIR, 'torrc.new'),
help="Write the torrc configuration to a file")
parser.add_argument('--good_contacts', type=str, default=os.path.join(ETC_DIR, 'goodcontacts.yaml'),
help="Write the proof data of the included nodes to a YAML file")
return parser
def vwrite_badnodes(oArgs, oBAD_NODES, slen):
if oArgs.bad_nodes:
tmp = oArgs.bad_nodes +'.tmp'
bak = oArgs.bad_nodes +'.bak'
with open(tmp, 'wt') as oFYaml:
yaml.dump(oBAD_NODES, oFYaml)
LOG.info(f"Wrote {slen} to {oArgs.bad_nodes}")
oFYaml.close()
if os.path.exists(oArgs.bad_nodes):
os.rename(oArgs.bad_nodes, bak)
os.rename(tmp, oArgs.bad_nodes)
def vwrite_goodnodes(oArgs, oGOOD_NODES, ilen):
if oArgs.good_nodes:
tmp = oArgs.good_nodes +'.tmp'
bak = oArgs.good_nodes +'.bak'
with open(tmp, 'wt') as oFYaml:
yaml.dump(oGOOD_NODES, oFYaml)
LOG.info(f"Wrote {ilen} good relays to {oArgs.good_nodes}")
oFYaml.close()
if os.path.exists(oArgs.good_nodes):
os.rename(oArgs.good_nodes, bak)
os.rename(tmp, oArgs.good_nodes)
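vwrite_badnodes and vwrite_goodnodes share the same write-to-.tmp, keep-a-.bak, rename pattern. A standalone sketch of that pattern (the helper name vatomic_write is hypothetical; the real functions dump YAML rather than plain text):

```python
import os

def vatomic_write(sFile: str, sText: str) -> None:
    # write the new content to a .tmp file first so a crash can
    # never leave a truncated sFile behind
    tmp = sFile + '.tmp'
    bak = sFile + '.bak'
    with open(tmp, 'wt') as oFd:
        oFd.write(sText)
    # keep the previous version as .bak, then swap the new one in
    if os.path.exists(sFile):
        os.rename(sFile, bak)
    os.rename(tmp, sFile)
```

os.rename is atomic on POSIX when source and destination are on the same filesystem, which is why the .tmp file is created next to the target rather than in /tmp.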
def iMain(lArgs):
global aTRUST_DB
global aTRUST_DB_INDEX
global oBAD_NODES
global oGOOD_NODES
global lKNOWN_NODNS
parser = oMainArgparser()
oArgs = parser.parse_args(lArgs)
vsetup_logging(oArgs.log_level)
if bAreWeConnected() is False:
raise SystemExit("we are not connected")
sFile = oArgs.torrc
if sFile and os.path.exists(sFile):
icheck_torrc(sFile, oArgs)
twhitelist_set = set()
sFile = oArgs.good_contacts
if sFile and os.path.exists(sFile):
try:
with open(sFile, 'rt') as oFd:
aTRUST_DB = safe_load(oFd)
LOG.info(f"{len(aTRUST_DB.keys())} trusted contacts from {sFile}")
# reverse lookup of fps to contacts
# but...
for k,v in aTRUST_DB.items():
if 'modified' not in v.keys():
v['modified'] = int(time.time())
aTRUST_DB_INDEX[k] = v
if 'fps' in aTRUST_DB[k].keys():
for fp in aTRUST_DB[k]['fps']:
if fp in aTRUST_DB_INDEX:
continue
aTRUST_DB_INDEX[fp] = v
LOG.info(f"{len(aTRUST_DB_INDEX.keys())} good relays from {sFile}")
except Exception as e:
LOG.exception(f"Error reading YAML TrustDB {sFile} {e}")
if os.path.exists(oArgs.proxy_ctl):
controller = oMakeController(sSock=oArgs.proxy_ctl)
else:
port =int(oArgs.proxy_ctl)
controller = oMakeController(port=port)
vwait_for_controller(controller, oArgs.wait_boot)
if oArgs.good_contacts:
good_contacts_tmp = oArgs.good_contacts + '.tmp'
elt = controller.get_conf('UseMicrodescriptors')
if elt != '0':
LOG.error('"UseMicrodescriptors 0" is required in your /etc/tor/torrc. Exiting.')
controller.set_conf('UseMicrodescriptors', '0')
# does it work dynamically?
return 2
elt = controller.get_conf(sEXCLUDE_EXIT_KEY)
if elt and elt != '{??}':
LOG.warn(f"{sEXCLUDE_EXIT_KEY} is in use already")
twhitelist_set.update(set(lYamlGoodNodes(oArgs.good_nodes)))
LOG.info(f"lYamlGoodNodes {len(twhitelist_set)} GuardNodes from {oArgs.good_nodes}")
global oGOOD_NODES
t = set()
if 'IntroductionPoints' in oGOOD_NODES[oGOOD_ROOT]['Relays'].keys():
t = set(oGOOD_NODES[oGOOD_ROOT]['Relays']['IntroductionPoints'])
w = set()
if 'Services' in oGOOD_NODES[oGOOD_ROOT].keys():
# 'Onions' can I use the IntroductionPoints for Services too?
# w = set(oGOOD_NODES[oGOOD_ROOT]['Services'])
pass
if 'Onions' in oGOOD_NODES[oGOOD_ROOT].keys():
# Provides the descriptor for a hidden service. The **address** is the
# '.onion' address of the hidden service
w = set(oGOOD_NODES[oGOOD_ROOT]['Onions'])
if oArgs.white_onions:
w.update(oArgs.white_onions.split(','))
if oArgs.points_timeout > 0:
LOG.info(f"{len(w)} services will be checked from IntroductionPoints")
t.update(lIntroductionPoints(controller, w, itimeout=oArgs.points_timeout))
if len(t) > 0:
LOG.info(f"IntroductionPoints {len(t)} relays from {len(w)} services")
twhitelist_set.update(t)
texclude_set = set()
if oArgs.bad_nodes and os.path.exists(oArgs.bad_nodes):
if False and oArgs.bad_sections:
# BROKEN
sections = oArgs.bad_sections.split(',')
texclude_set = set(lYamlBadNodes(oArgs.bad_nodes,
lWanted=sections,
section=sEXCLUDE_EXIT_KEY))
LOG.info(f"Preloaded {len(texclude_set)} bad fps")
ttrust_db_index = aTRUST_DB_INDEX.keys()
tdns_urls = set()
iFakeContact = 0
iTotalContacts = 0
aBadContacts = {}
lConds = oArgs.contact.split(',')
iR = 0
relays = controller.get_server_descriptors()
for relay in relays:
iR += 1
if not is_valid_fingerprint(relay.fingerprint):
LOG.warn('Invalid Fingerprint: %s' % relay.fingerprint)
continue
relay.fingerprint = relay.fingerprint.upper()
sofar = f"G:{len(aTRUST_DB.keys())} U:{len(tdns_urls)} F:{iFakeContact} BF:{len(texclude_set)} GF:{len(ttrust_db_index)} TC:{iTotalContacts} #{iR}"
if not relay.exit_policy.is_exiting_allowed():
if sEXCLUDE_EXIT_KEY == 'ExcludeNodes':
pass # LOG.debug(f"{relay.fingerprint} not an exit {sofar}")
else:
pass # LOG.warn(f"{relay.fingerprint} not an exit {sofar}")
# continue
# great contact had good fps and we are in them
if relay.fingerprint in aTRUST_DB_INDEX.keys():
# a cached entry
continue
if type(relay.contact) == bytes:
# dunno
relay.contact = str(relay.contact, 'UTF-8')
if ('Empty' in lConds and not relay.contact) or \
('NoEmail' in lConds and relay.contact and not 'email:' in relay.contact):
texclude_set.add(relay.fingerprint)
continue
if not relay.contact or not 'ciissversion:' in relay.contact:
# should be unreached 'Empty' should always be in lConds
continue
iTotalContacts += 1
fp = relay.fingerprint
if relay.contact and not 'url:' in relay.contact:
LOG.info(f"{fp} skipping bad contact - no url: {sofar}")
LOG.debug(f"{fp} {relay.contact} {sofar}")
texclude_set.add(fp)
continue
c = relay.contact.lower()
# first rough cut
i = c.find('url:')
if i >= 0:
c = c[i+4:]
i = c.find(' ')
if i >= 0: c = c[:i]
# lstrip('https://') strips any run of the characters h,t,p,s,:,/
# (mangling e.g. domains starting with 's') - remove the scheme as a prefix
if c.startswith('https://'):
c = c[len('https://'):]
elif c.startswith('http://'):
c = c[len('http://'):]
c = c.strip('/')
i = c.find('/')
if i >= 0: c = c[:i]
domain = c
if domain and bdomain_is_bad(domain, fp):
LOG.info(f"{fp} skipping bad {domain} {sofar}")
LOG.debug(f"{fp} {relay.contact} {sofar}")
texclude_set.add(fp)
continue
if domain:
ip = zResolveDomain(domain)
if not ip:
LOG.warn(f"{fp} {domain} did not resolve {sofar}")
texclude_set.add(fp)
lKNOWN_NODNS.append(domain)
iFakeContact += 1
continue
if 'dns-rsa' in relay.contact.lower():
target = f"{relay.fingerprint}.{domain}"
LOG.info(f"skipping 'dns-rsa' {target} {sofar}")
tdns_urls.add(target)
elif 'proof:uri-rsa' in relay.contact.lower():
a = aParseContact(relay.contact, relay.fingerprint)
if not a:
LOG.warn(f"{relay.fingerprint} did not parse {sofar}")
texclude_set.add(relay.fingerprint)
continue
if 'url' in a and a['url']:
if a['url'] in tBAD_URLS:
# The fp is using a contact with a URL we know is bad
LOG.info(f"{relay.fingerprint} skipping in tBAD_URLS {a['url']} {sofar}")
LOG.debug(f"{relay.fingerprint} {a} {sofar}")
iFakeContact += 1
texclude_set.add(relay.fingerprint)
continue
domain = a['url'].replace('https://', '').replace('http://', '')
if domain in lKNOWN_NODNS:
# The fp is using a contact with a URL we know is bogus
LOG.info(f"{relay.fingerprint} skipping in lKNOWN_NODNS {a['url']} {sofar}")
LOG.debug(f"{relay.fingerprint} {a} {sofar}")
iFakeContact += 1
texclude_set.add(relay.fingerprint)
continue
b = aVerifyContact(list(a.values())[0],
relay.fingerprint,
oArgs.https_cafile,
timeout=oArgs.timeout,
host=oArgs.proxy_host,
port=oArgs.proxy_port)
if not b or not 'fps' in b or not b['fps'] or not b['url']:
LOG.warn(f"{relay.fingerprint} did NOT VERIFY {sofar}")
LOG.debug(f"{relay.fingerprint} {b} {sofar}")
# If it's giving contact info that doesn't check out,
# it could be a bad exit with fake contact info
texclude_set.add(relay.fingerprint)
aBadContacts[relay.fingerprint] = b
continue
if relay.fingerprint not in b['fps']:
LOG.warn(f"{relay.fingerprint} the FP IS NOT in the list of fps {sofar}")
# assume a fp is using a bogus contact
texclude_set.add(relay.fingerprint)
iFakeContact += 1
aBadContacts[relay.fingerprint] = b
continue
LOG.info(f"{relay.fingerprint} verified {b['url']} {sofar}")
# add our contact info to the trustdb
aTRUST_DB[relay.fingerprint] = b
for elt in b['fps']:
aTRUST_DB_INDEX[elt] = b
if oArgs.good_contacts and oArgs.log_level <= 20:
# as we go along then clobber
with open(good_contacts_tmp, 'wt') as oFYaml:
yaml.dump(aTRUST_DB, oFYaml)
oFYaml.close()
LOG.info(f"Filtered {len(twhitelist_set)} whitelisted relays")
texclude_set = texclude_set.difference(twhitelist_set)
# accept the dns-rsa urls for now until we test them
texclude_set = texclude_set.difference(tdns_urls)
LOG.info(f"{len(list(aTRUST_DB.keys()))} good contacts out of {iTotalContacts}")
if oArgs.torrc_output and texclude_set:
with open(oArgs.torrc_output, 'wt') as oFTorrc:
oFTorrc.write(f"{sEXCLUDE_EXIT_KEY} {','.join(texclude_set)}\n")
oFTorrc.write(f"{sINCLUDE_EXIT_KEY} {','.join(aTRUST_DB_INDEX.keys())}\n")
oFTorrc.write(f"{sINCLUDE_GUARD_KEY} {','.join(oGOOD_NODES[oGOOD_ROOT]['GuardNodes'])}\n")
LOG.info(f"Wrote tor configuration to {oArgs.torrc_output}")
oFTorrc.close()
if oArgs.bad_contacts and aBadContacts:
# for later analysis
with open(oArgs.bad_contacts, 'wt') as oFYaml:
yaml.dump(aBadContacts, oFYaml)
oFYaml.close()
if oArgs.good_contacts != '' and aTRUST_DB:
with open(good_contacts_tmp, 'wt') as oFYaml:
yaml.dump(aTRUST_DB, oFYaml)
oFYaml.close()
if os.path.exists(oArgs.good_contacts):
bak = oArgs.good_contacts +'.bak'
os.rename(oArgs.good_contacts, bak)
os.rename(good_contacts_tmp, oArgs.good_contacts)
LOG.info(f"Wrote {len(list(aTRUST_DB.keys()))} good contact details to {oArgs.good_contacts}")
oBAD_NODES[oBAD_ROOT]['ExcludeNodes']['BadExit'] = list(texclude_set)
oBAD_NODES[oBAD_ROOT]['ExcludeDomains'] = lKNOWN_NODNS
vwrite_badnodes(oArgs, oBAD_NODES, str(len(texclude_set)))
oGOOD_NODES['GoodNodes']['Relays']['ExitNodes'] = list(aTRUST_DB_INDEX.keys())
# GuardNodes are readonly
vwrite_goodnodes(oArgs, oGOOD_NODES, len(aTRUST_DB_INDEX.keys()))
retval = 0
try:
logging.getLogger('stem').setLevel(30)
try:
if texclude_set:
LOG.info(f"{sEXCLUDE_EXIT_KEY} {len(texclude_set)} net bad exit relays")
controller.set_conf(sEXCLUDE_EXIT_KEY, list(texclude_set))
except stem.SocketClosed as e:
LOG.error(f"Failed setting {sEXCLUDE_EXIT_KEY} bad exit relays in Tor")
retval += 1
try:
if aTRUST_DB_INDEX.keys():
LOG.info(f"{sINCLUDE_EXIT_KEY} {len(aTRUST_DB_INDEX.keys())} good relays")
controller.set_conf(sINCLUDE_EXIT_KEY, list(aTRUST_DB_INDEX.keys()))
except stem.SocketClosed as e:
LOG.error(f"Failed setting {sINCLUDE_EXIT_KEY} good exit nodes in Tor")
retval += 1
try:
if 'GuardNodes' in oGOOD_NODES[oGOOD_ROOT].keys():
LOG.info(f"{sINCLUDE_GUARD_KEY} {len(oGOOD_NODES[oGOOD_ROOT]['GuardNodes'])} guard nodes")
# FixMe for now override StrictNodes it may be unusable otherwise
controller.set_conf(sINCLUDE_GUARD_KEY,
oGOOD_NODES[oGOOD_ROOT]['GuardNodes'])
cur = controller.get_conf('StrictNodes')
if oArgs.strict_nodes and int(cur) != oArgs.strict_nodes:
LOG.info(f"OVERRIDING StrictNodes to {oArgs.strict_nodes}")
controller.set_conf('StrictNodes', str(oArgs.strict_nodes))
else:
LOG.info(f"StrictNodes is set to {cur}")
except stem.SocketClosed as e:
LOG.error(f"Failed setting {sINCLUDE_GUARD_KEY} guard nodes in Tor")
retval += 1
except InvalidRequest as e:
# Unacceptable option value: Invalid router list.
LOG.error(str(e))
retval = 1
return retval
except KeyboardInterrupt:
return 0
except Exception as e:
LOG.exception(str(e))
retval = 2
return retval
finally:
# weird: we are getting stem errors during the final return
# with a traceback that doesn't correspond to any real flow
# File "/usr/lib/python3.9/site-packages/stem/control.py", line 2474, in set_conf
# self.set_options({param: value}, False)
logging.getLogger('stem').setLevel(40)
try:
for elt in controller._event_listeners:
controller.remove_event_listener(elt)
controller.close()
except Exception as e:
LOG.warn(str(e))
sys.stdout.write("dns-rsa domains:\n" +'\n'.join(tdns_urls) +'\n')
return retval
if __name__ == '__main__':
try:
i = iMain(sys.argv[1:])
except IncorrectPassword as e:
LOG.error(e)
i = 1
except KeyboardInterrupt:
i = 0
except Exception as e:
LOG.exception(e)
i = 2
sys.exit(i)

exclude_badExits.txt Normal file
@ -0,0 +1,66 @@
# -*-mode: doctest; tab-width: 0; py-indent-offset: 4; coding: utf-8-unix -*-
== exclude_badExits testing ==
This is a Python doctest file that is executable documentation.
exclude_badExits extends nusenu's basic idea of using the stem library to
dynamically exclude nodes that are likely to be bad by putting them
on the ExcludeNodes or ExcludeExitNodes setting of a running Tor.
* https://github.com/nusenu/noContactInfo_Exit_Excluder
* https://github.com/TheSmashy/TorExitRelayExclude
The basic idea is to exclude Exit nodes that do not have ContactInfo:
* https://github.com/nusenu/ContactInfo-Information-Sharing-Specification
That can be extended to relays that do not have an email in the contact,
or to relays that do not have ContactInfo that is verified to include them.
Pass the controller password if needed as an environment variable:
>>> import os
>>> assert os.environ['TOR_CONTROLLER_PASSWORD']
Add our code to the PYTHONPATH
>>> import sys
>>> sys.path.append(os.path.join(os.getcwd(), 'src', 'exclude_badExits'))
We'll need the settings defined in {{{/usr/local/etc/testforge/testforge.yml}}}
>>> print("yaml", file=sys.stderr)
>>> import yaml
>>> sFacts = open('/usr/local/etc/testforge/testforge.yml').read()
>>> assert sFacts
>>> dFacts = yaml.safe_load(sFacts)
FixMe: use the settings for the ports and directories below.
>>> import os
>>> os.environ['http_proxy'] = 'http://'+dFacts['HTTP_PROXYHOST']+':'+str(dFacts['HTTP_PROXYPORT'])
>>> os.environ['https_proxy'] = 'http://'+dFacts['HTTPS_PROXYHOST']+':'+str(dFacts['HTTPS_PROXYPORT'])
>>> os.environ['socks_proxy'] = 'socks5://'+dFacts['SOCKS_PROXYHOST']+':'+str(dFacts['SOCKS_PROXYPORT'])
Load the module:
>>> print("exclude_badExits", file=sys.stderr)
>>> from exclude_badExits import exclude_badExits
>>> lArgs = ['--help']
Read the usage:
>>> exclude_badExits.iMain(lArgs)
usage: ...
<BLANKLINE>
Torrc to check for suggestions:
>>> lArgs = ['--torrc', '/etc/tor/torrc-defaults']
>>> exclude_badExits.iMain(lArgs)
INFO ...
<BLANKLINE>
This may take a while:
>>> lArgs = ['--proxy_ctl', '9051']
>>> exclude_badExits.iMain(lArgs)

pyproject.toml Normal file
@ -0,0 +1,47 @@
[project]
name = "exclude_badExits"
description = "Set the ExcludeNodes or ExcludeExitNodes setting of a running Tor."
authors = [{ name = "emdee", email = "emdee@spm.plastiras.org" } ]
requires-python = ">=3.6"
keywords = ["tor", "python3", "bad exits"]
classifiers = [
"License :: OSI Approved",
"Operating System :: POSIX :: BSD :: FreeBSD",
"Operating System :: POSIX :: Linux",
"Programming Language :: Python :: 3 :: Only",
"Programming Language :: Python :: 3.6",
"Programming Language :: Python :: 3.7",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: Implementation :: CPython",
]
dynamic = ["version", "readme", ] # cannot be dynamic ['license']
scripts = { exclude_badExits = "exclude_badExits.exclude_badExits:iMain" }
dependencies = [
'qasync >= 0.27.1',
'cryptography >= 41.0.7',
'rsa >= 4.9',
'stem >= 1.8.2']
[tool.setuptools.dynamic]
version = {attr = "exclude_badExits.__version__"}
readme = {file = ["README.md"]}
[project.license]
file = "LICENSE.md"
[project.urls]
repository = "https://git.plastiras.org/emdee/exclude_badExits"
[build-system]
requires = ["setuptools >= 61.0"]
build-backend = "setuptools.build_meta"
# Either or both of these don't work
#[tool.setuptools]
#packages = ["exclude_badExits"]
#[tool.setuptools.packages.find]
#include = ["src"]

setup.py Normal file
@ -0,0 +1,47 @@
# -*-mode: python; indent-tabs-mode: nil; py-indent-offset: 4; coding: utf-8 -*
import re
from setuptools import setup, find_packages
with open("src/exclude_badExits/__init__.py") as f:
version = re.search(r'__version__\s+=\s+"(.*)"', f.read()).group(1)
long_description = "\n\n".join([
open("README.md").read(),
])
if __name__ == '__main__':
setup(
name="exclude_badExits",
version=version,
description="""A program to exclude bad exits on the Tor network""",
long_description=long_description,
author="Nusenu (originally)",
author_email='',
license="1-clause BSD",
packages = find_packages(exclude=['test*']),
# url="",
# download_url="https://",
keywords=['exit nodes', 'Tor', 'tor onion controller'],
# maybe less - nothing fancy
python_requires="~=3.6",
install_requires=['cryptography',
'rsa',
'stem',
'urllib3',
'PyYAML'],
entry_points={
'console_scripts': ['exclude_badExits = exclude_badExits.__main__:iMain', ]},
classifiers=[
'Development Status :: 4 - Beta',
'Environment :: Console',
'Intended Audience :: Developers',
'Natural Language :: English',
'Operating System :: OS Independent',
'Programming Language :: Python :: 3',
'Topic :: Security',
'Topic :: Software Development :: Libraries :: Python Modules',
],
)

src/__init__.py Normal file
@ -0,0 +1,2 @@
__version__ = "1.0.0"

File diff suppressed because it is too large
@ -0,0 +1,412 @@
#!/usr/local/bin/python3.sh
# -*- mode: python; indent-tabs-mode: nil; py-indent-offset: 4; coding: utf-8 -*-
import argparse
from argparse import Namespace
import os
import sys
import re
from io import StringIO
import logging
import warnings
global LOG
from support_onions import (oGetStemController,
vwait_for_controller,)
try:
from ruamel.yaml import YAML
yaml = YAML(typ='rt')
yaml.indent(mapping=2, sequence=2)
safe_load = yaml.load
except:
yaml = None
if yaml is None:
try:
import yaml
safe_load = yaml.safe_load
except:
yaml = None
try:
# if 'COLOREDLOGS_LEVEL_STYLES' not in os.environ:
# os.environ['COLOREDLOGS_LEVEL_STYLES'] = 'spam=22;debug=28;verbose=34;notice=220;warning=202;success=118,bold;error=124;critical=background=red'
# https://pypi.org/project/coloredlogs/
import coloredlogs
except ImportError:
coloredlogs = False
def aCleanContact(a, lAT_REPS, lDOT_REPS, lNO_EMAIL) -> dict:
# cleanups
for elt in lINTS:
if elt in a:
a[elt] = int(a[elt])
for elt in lBOOLS:
if elt not in a: continue
if a[elt] in ['y', 'yes', 'true', 'True']:
a[elt] = True
else:
a[elt] = False
for elt in lEMAILS:
if elt not in a: continue
a[elt] = sCleanEmail(a[elt], lAT_REPS, lDOT_REPS, lNO_EMAIL)
if 'url' in a.keys():
a['url'] = a['url'].rstrip('/')
if a['url'].startswith('http://'):
domain = a['url'].replace('http://', '')
elif a['url'].startswith('https://'):
domain = a['url'].replace('https://', '')
else:
domain = a['url']
a['url'] = 'https://' + domain
a.update({'fps': []})
return a
def sCleanEmail(s, lAT_REPS, lDOT_REPS, lNO_EMAIL) -> str:
s = s.lower()
for elt in lAT_REPS:
if not elt.startswith(' '):
s = s.replace(' ' + elt + ' ', '@')
s = s.replace(elt, '@')
for elt in lDOT_REPS:
if not elt.startswith(' '):
s = s.replace(' ' + elt + ' ', '.')
s = s.replace(elt, '.')
s = s.replace('(dash)', '-')
s = s.replace('hyphen ', '-')
for elt in lNO_EMAIL:
s = s.replace(elt, '?')
return s
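For illustration, a self-contained copy of sCleanEmail with toy replacement lists (the real caller passes larger lAT_REPS/lDOT_REPS/lNO_EMAIL lists):

```python
def sCleanEmail(s, lAT_REPS, lDOT_REPS, lNO_EMAIL):
    # de-obfuscate '(at)'/'(dot)'-style spellings, then blank out refusals
    s = s.lower()
    for elt in lAT_REPS:
        if not elt.startswith(' '):
            s = s.replace(' ' + elt + ' ', '@')
        s = s.replace(elt, '@')
    for elt in lDOT_REPS:
        if not elt.startswith(' '):
            s = s.replace(' ' + elt + ' ', '.')
        s = s.replace(elt, '.')
    s = s.replace('(dash)', '-')
    s = s.replace('hyphen ', '-')
    for elt in lNO_EMAIL:
        s = s.replace(elt, '?')
    return s

# toy replacement lists - illustrative only
print(sCleanEmail('tor (at) example (dot) org', ['(at)'], ['(dot)'], []))
# -> tor@example.org
```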
lMORONS = ['hoster:Quintex Alliance Consulting ']
oCONTACT_RE = re.compile(r'([^:]*)(\s+)(email|url|proof|ciissversion|abuse|gpg):')
lINTS = ['ciissversion', 'uplinkbw', 'signingkeylifetime', 'memory']
lBOOLS = ['dnssec', 'dnsqname', 'aesni', 'autoupdate', 'dnslocalrootzone',
'sandbox', 'offlinemasterkey']
lEMAILS = ['abuse', 'email']
ETC_DIR = '/usr/local/etc/tor/yaml'
def oStemController(oargs, sEXCLUDE_EXIT_GROUP):
if os.path.exists(oargs.proxy_ctl):
controller = oGetStemController(log_level=oargs.log_level,
sock_or_pair=oargs.proxy_ctl,
password=oargs.torctl_pass)
else:
port =int(oargs.proxy_ctl)
controller = oGetStemController(log_level=oargs.log_level,
sock_or_pair=port,
password=oargs.torctl_pass)
vwait_for_controller(controller, oargs.wait_boot)
elt = controller.get_conf('UseMicrodescriptors')
if elt != '0':
LOG.error('"UseMicrodescriptors 0" is required in your /etc/tor/torrc. Exiting.')
controller.set_conf('UseMicrodescriptors', '0')
# does it work dynamically?
return 2
elt = controller.get_conf(sEXCLUDE_EXIT_GROUP)
if elt and elt != '{??}':
LOG.warn(f"{sEXCLUDE_EXIT_GROUP} is in use already")
return controller
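oStemController decides between a unix control socket and a TCP control port purely by whether proxy_ctl exists as a filesystem path. That check can be factored out as a one-liner (bIsControlSocket is a hypothetical name):

```python
import os

def bIsControlSocket(proxy_ctl: str) -> bool:
    # a path that exists on disk is treated as a unix control socket;
    # anything else is assumed to be a numeric ControlPort for int()
    return os.path.exists(proxy_ctl)

print(bIsControlSocket('/run/tor/control'))  # True only if the socket exists
print(bIsControlSocket('9051'))
```

This matches the argparser default: '/run/tor/control' if that path exists, otherwise the string '9051'.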
def aParseContactYaml(contact, fp) -> dict:
"""
See the Tor ContactInfo Information Sharing Specification v2
https://nusenu.github.io/ContactInfo-Information-Sharing-Specification/
"""
l = [line for line in contact.strip().replace('"', '').split(' ')
if ':' in line]
LOG.debug(f"{fp} {len(l)} fields")
s = f'"{fp}":\n'
s += '\n'.join([f" {line}\"".replace(':', ': \"', 1)
for line in l])
oFd = StringIO(s)
a = safe_load(oFd)
return a
def aParseContact(contact, fp, lAT_REPS, lDOT_REPS, lNO_EMAIL) -> dict:
"""
See the Tor ContactInfo Information Sharing Specification v2
https://nusenu.github.io/ContactInfo-Information-Sharing-Specification/
"""
a = {}
if not contact:
LOG.warn(f"BC null contact for {fp}")
LOG.debug(f"{fp} {contact}")
return {}
contact = contact.split(r'\n')[0]
for elt in lMORONS:
contact = contact.replace(elt, '')
m = oCONTACT_RE.match(contact)
# 450 matches!
if m and len(m.groups()) > 2 and m.span()[1] > 0:
i = len(m.groups(0)[0]) + len(m.groups(0)[1])
contact = contact[i:]
# shlex?
lelts = contact.split(' ')
if not lelts:
LOG.warn(f"BC empty contact for {fp}")
LOG.debug(f"{fp} {contact}")
return {}
for elt in lelts:
if ':' not in elt:
# hoster:Quintex Alliance Consulting
LOG.warn(f"BC no : in {elt} for {contact} in {fp}")
# return {}
# try going with what we have
break
(key , val,) = elt.split(':', 1)
if key == '':
continue
key = key.rstrip(':')
a[key] = val
a = aCleanContact(a, lAT_REPS, lDOT_REPS, lNO_EMAIL)
return a
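The core of aParseContact is a whitespace split into key:value fields that stops at the first field without a colon. A minimal standalone sketch (aSplitContact is hypothetical; the real function also strips a leading free-form name, known-bad hoster strings, and runs aCleanContact):

```python
def aSplitContact(contact: str) -> dict:
    a = {}
    for elt in contact.split(' '):
        if ':' not in elt:
            # free-form text such as 'hoster:Quintex Alliance Consulting'
            # spills fields without a colon - keep what we have so far
            break
        key, val = elt.split(':', 1)
        if key:
            a[key] = val
    return a

contact = 'ciissversion:2 email:tor[]example.org url:https://example.org'
print(aSplitContact(contact))
```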
def vwrite_good_contacts(oargs, aGOOD_CONTACTS_DB) -> None:
good_contacts_tmp = oargs.good_contacts + '.tmp'
with open(good_contacts_tmp, 'wt') as oFYaml:
yaml.dump(aGOOD_CONTACTS_DB, oFYaml)
oFYaml.close()
if os.path.exists(oargs.good_contacts):
bak = oargs.good_contacts +'.bak'
os.rename(oargs.good_contacts, bak)
os.rename(good_contacts_tmp, oargs.good_contacts)
LOG.info(f"Wrote {len(list(aGOOD_CONTACTS_DB.keys()))} good contact details to {oargs.good_contacts}")
def vwrite_bad_contacts(oargs, aBAD_CONTACTS_DB) -> None:
bad_contacts_tmp = oargs.bad_contacts + '.tmp'
with open(bad_contacts_tmp, 'wt') as oFYaml:
yaml.dump(aBAD_CONTACTS_DB, oFYaml)
oFYaml.close()
if os.path.exists(oargs.bad_contacts):
bak = oargs.bad_contacts +'.bak'
os.rename(oargs.bad_contacts, bak)
os.rename(bad_contacts_tmp, oargs.bad_contacts)
LOG.info(f"Wrote {len(list(aBAD_CONTACTS_DB.keys()))} bad contact details to {oargs.bad_contacts}")
def vwrite_badnodes(oargs, aBAD_NODES, slen, stag) -> None:
if not aBAD_NODES: return
tmp = oargs.bad_nodes +'.tmp'
bak = oargs.bad_nodes +'.bak'
with open(tmp, 'wt') as oFYaml:
yaml.dump(aBAD_NODES, oFYaml)
LOG.info(f"Wrote to {oargs.bad_nodes}")
oFYaml.close()
if os.path.exists(oargs.bad_nodes):
os.rename(oargs.bad_nodes, bak)
os.rename(tmp, oargs.bad_nodes)
def vwrite_goodnodes(oargs, aGOOD_NODES, ilen) -> None:
tmp = oargs.good_nodes +'.tmp'
bak = oargs.good_nodes +'.bak'
with open(tmp, 'wt') as oFYaml:
yaml.dump(aGOOD_NODES, oFYaml)
LOG.info(f"Wrote good relays to {oargs.good_nodes}")
oFYaml.close()
if os.path.exists(oargs.good_nodes):
os.rename(oargs.good_nodes, bak)
os.rename(tmp, oargs.good_nodes)
def vwritefinale(oargs, lNOT_IN_RELAYS_DB) -> None:
if len(lNOT_IN_RELAYS_DB):
LOG.warn(f"{len(lNOT_IN_RELAYS_DB)} relays from stem were not in onionoo.torproject.org")
LOG.info(f"For info on a FP, use: https://nusenu.github.io/OrNetStats/w/relay/<FP>.html")
LOG.info(f"For info on relays, try: https://onionoo.torproject.org/details")
# https://onionoo.torproject.org/details
def alt_vsetup_logging(theLOG, log_level, logfile='', stream=sys.stderr) -> None:
global LOG
LOG = theLOG
add = True
logging._defaultFormatter = logging.Formatter(datefmt='%m-%d %H:%M:%S')
logging._defaultFormatter.default_time_format = '%m-%d %H:%M:%S'
logging._defaultFormatter.default_msec_format = ''
if logfile:
add = logfile.startswith('+')
sub = logfile.startswith('-')
if add or sub:
logfile = logfile[1:]
if coloredlogs:
coloredlogs.DEFAULT_LEVEL_STYLES['info']=dict(color='white',bold=True)
coloredlogs.DEFAULT_LEVEL_STYLES['debug']=dict(color='cyan')
coloredlogs.DEFAULT_LEVEL_STYLES['warn']=dict(color='yellow',bold=True)
coloredlogs.DEFAULT_LEVEL_STYLES['error']=dict(color='red',bold=True)
coloredlogs.DEFAULT_FIELD_STYLES['levelname'] = dict(color='green', bold=True)
# https://pypi.org/project/coloredlogs/
aKw = dict(level=log_level,
logger=LOG,
stream=stream,
fmt='%(levelname)s %(message)s',
isatty=True,
milliseconds=False,
)
coloredlogs.install(**aKw)
if logfile:
oHandler = logging.FileHandler(logfile)
LOG.addHandler(oHandler)
LOG.info(f"Setting coloured log_level to {log_level} {stream}")
else:
kwargs = dict(level=log_level,
force=True,
format='%(levelname)s %(message)s')
if logfile:
kwargs['filename'] = logfile
logging.basicConfig(**kwargs)
if add and logfile:
oHandler = logging.StreamHandler(stream)
LOG.addHandler(oHandler)
LOG.info(f"Setting log_level to {log_level!s}")
def vsetup_logging(theLOG, log_level, logfile='', stream=sys.stdout) -> None:
global LOG
LOG = theLOG
add = True
# stem fucks up logging
# from stem.util import log
logging.getLogger('stem').setLevel(30)
logging._defaultFormatter = logging.Formatter(datefmt='%m-%d %H:%M:%S')
logging._defaultFormatter.default_time_format = '%m-%d %H:%M:%S'
logging._defaultFormatter.default_msec_format = ''
if logfile:
add = logfile.startswith('+')
sub = logfile.startswith('-')
if add or sub:
logfile = logfile[1:]
if coloredlogs:
# https://pypi.org/project/coloredlogs/
coloredlogs.install(
level=log_level,
logger=LOG,
stream=stream,
fmt='%(levelname)s %(message)s',
isatty=True, # required!
milliseconds=False,
)
if logfile:
oHandler = logging.FileHandler(logfile)
LOG.addHandler(oHandler)
LOG.info(f"Setting coloured log_level to {log_level} {stream}")
else:
kwargs = dict(level=log_level,
force=True,
format='%(levelname)s %(message)s')
if logfile:
kwargs['filename'] = logfile
logging.basicConfig(**kwargs)
if add and logfile:
oHandler = logging.StreamHandler(stream)
LOG.addHandler(oHandler)
LOG.info(f"Setting log_level to {log_level}")
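When coloredlogs is not installed, vsetup_logging falls back to plain logging.basicConfig after quieting stem's very chatty logger. The essence of that fallback path:

```python
import logging

# quiet stem (level 30 == WARNING), then configure the root logger
LOG = logging.getLogger()
logging.getLogger('stem').setLevel(30)
logging.basicConfig(level=20, force=True,
                    format='%(levelname)s %(message)s')
LOG.info('Setting log_level to 20')
```

force=True (Python 3.8+) removes any handlers already attached to the root logger, so repeated calls reconfigure cleanly.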
def oMainArgparser(_=None, __prolog__='') -> argparse.ArgumentParser:
try:
from OpenSSL import SSL
lCAfs = SSL._CERTIFICATE_FILE_LOCATIONS
except:
lCAfs = []
CAfs = []
for elt in lCAfs:
if os.path.exists(elt):
CAfs.append(elt)
if not CAfs:
CAfs = ['']
parser = argparse.ArgumentParser(add_help=True,
epilog=__prolog__)
# important settings
parser.add_argument('--bad_on', type=str, default='Empty,NoEmail,NotGood',
help="comma sep list of conditions - Empty,NoEmail,NotGood")
parser.add_argument('--points_timeout', type=int, default=0,
help="Timeout for getting introduction points - must be long (>120 sec); 0 disables looking for introduction points")
parser.add_argument('--saved_only', default=False,
action='store_true',
help="Just use the info in the last *.yaml files without querying the Tor controller")
parser.add_argument('--hs_dir', type=str,
default='/var/lib/tor',
help="Parse the files named 'hostname' below this dir to find Hidden Services to whitelist")
parser.add_argument('--notice_log', type=str,
default='',
help="Parse the notice log for relays and services")
parser.add_argument('--strict_nodes', type=str, default='0',
choices=['0', '1'],
help="Set StrictNodes: 1 is less anonymous but more secure, although some onion sites may be unreachable")
# proxy
parser.add_argument('--proxy_host', '--proxy-host', type=str,
default='127.0.0.1',
help='proxy host')
parser.add_argument('--proxy_port', '--proxy-port', default=9050, type=int,
help='proxy socks port')
parser.add_argument('--proxy_ctl', '--proxy-ctl',
default='/run/tor/control' if os.path.exists('/run/tor/control') else '9051',
type=str,
help='control socket - or port')
parser.add_argument('--torctl_pass',
default=os.environ.get('TOR_CONTROLLER_PASSWORD', ''),
type=str,
help='password for the tor controller')
parser.add_argument('--torrc',
default='/etc/tor/torrc-defaults',
type=str,
help='torrc to check for suggestions')
# output
parser.add_argument('--torrc_output', type=str,
default=os.path.join(ETC_DIR, 'torrc.new'),
help="Write the torrc configuration to a file")
parser.add_argument('--good_nodes', type=str,
default=os.path.join(ETC_DIR, 'goodnodes.yaml'),
help="Yaml file of good info that should not be excluded")
parser.add_argument('--bad_nodes', type=str,
default=os.path.join(ETC_DIR, 'badnodes.yaml'),
help="Yaml file of bad nodes that should also be excluded")
parser.add_argument('--bad_contacts', type=str,
default=os.path.join(ETC_DIR, 'badcontacts.yaml'),
help="Yaml file of bad contacts that bad FPs are using")
parser.add_argument('--relays_output', type=str,
default=os.path.join(ETC_DIR, 'relays.json'),
help="Write the download relays in json to a file")
parser.add_argument('--wellknown_output', type=str,
default=os.path.join(ETC_DIR, 'https'),
help="Write the well-known files to a directory")
parser.add_argument('--good_contacts', type=str, default=os.path.join(ETC_DIR, 'goodcontacts.yaml'),
help="Write the proof data of the included nodes to a YAML file")
# timeouts
parser.add_argument('--timeout', default=60, type=int,
help='proxy download connect timeout')
parser.add_argument('--wait_boot', type=int, default=120,
help="Seconds to wait for Tor to bootstrap")
parser.add_argument('--https_cafile', type=str,
help="Certificate Authority file (in PEM)",
default=CAfs[0])
parser.add_argument('--log_level', type=int, default=20,
help="10=debug 20=info 30=warn 40=error")
parser.add_argument('--bad_sections', type=str,
default='BadExit',
help="sections of the badnodes.yaml to use, in addition to BadExit, comma separated")
parser.add_argument('--white_onions', type=str,
default='',
help="comma sep. list of onions to whitelist their introduction points - BROKEN")
return parser
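A sketch of how a caller consumes the parser above; only two of the flags are reconstructed here so the snippet runs standalone:

```python
import argparse
import os

# two representative options from oMainArgparser, reconstructed standalone
parser = argparse.ArgumentParser()
parser.add_argument('--proxy_ctl',
                    default='/run/tor/control'
                        if os.path.exists('/run/tor/control') else '9051',
                    type=str)
parser.add_argument('--log_level', type=int, default=20)

oargs = parser.parse_args(['--log_level', '10'])
print(oargs.log_level)
```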


@ -1,87 +1,255 @@
# -*- mode: python; indent-tabs-mode: nil; py-indent-offset: 4; coding: utf-8 -*-
import getpass
import os
import re
import select
import shutil
import socket
import sys
import time
import traceback
if False:
import cepa as stem
from cepa.connection import MissingPassword
from cepa.control import Controller
from cepa.util.tor_tools import is_valid_fingerprint
else:
import stem
from stem.connection import MissingPassword
from stem.control import Controller
from stem.util.tor_tools import is_valid_fingerprint
global LOG
import logging
import warnings
warnings.filterwarnings('ignore')
LOG = logging.getLogger()
bHAVE_TORR = shutil.which('tor-resolve')
yKNOWN_ONIONS = """
- facebookwkhpilnemxj7asaniu7vnjjbiltxjqhye3mhbshg7kx5tfyd # facebook
- duckduckgogg42xjoc72x3sjasowoarfbgcmvfimaftt6twagswzczad # ddg
- zkaan2xfbuxia2wpf7ofnkbz6r5zdbbvxbunvp5g2iebopbfc4iqmbad # hks
"""
# grep -B 1 '<li><a href="' /tmp/tor.html |sed -e 's/<li><a href="http:../ - /' -e 's/.onion.*//' -e 's/<li id=./ # /' -e 's/".*//' -e '/^--/d' -e '/<li id/d'
# This will slow things down 1-2 min each
yKNOWN_ONIONS_TOR = """
# 2019.www.torproject.org
- jqyzxhjk6psc6ul5jnfwloamhtyh7si74b4743k2qgpskwwxrzhsxmad
# api.donate.torproject.org
- rbi3fpvpz4vlrx67scoqef2zxz7k4xyiludszg655favvkygjmhz6wyd
# archive.torproject.org
- uy3qxvwzwoeztnellvvhxh7ju7kfvlsauka7avilcjg7domzxptbq7qd
# aus1.torproject.org
- ot3ivcdxmalbsbponeeq5222hftpf3pqil24q3s5ejwo5t52l65qusid
# aus2.torproject.org
- b5t7emfr2rn3ixr4lvizpi3stnni4j4p6goxho7lldf4qg4hz5hvpqid
# blog.torproject.org
- pzhdfe7jraknpj2qgu5cz2u3i4deuyfwmonvzu5i3nyw4t4bmg7o5pad
# bridges.torproject.org
- yq5jjvr7drkjrelzhut7kgclfuro65jjlivyzfmxiq2kyv5lickrl4qd
# cloud.torproject.org
- ui3cpcohcoko6aydhuhlkwqqtvadhaflcc5zb7mwandqmcal7sbwzwqd
# collector.torproject.org
- pgmrispjerzzf2tdzbfp624cg5vpbvdw2q5a3hvtsbsx25vnni767yad
# collector2.torproject.org
- 3srlmjzbyyzz62jvdfqwn5ldqmh6mwnqxre2zamxveb75uz2qrqkrkyd
# community.torproject.org
- xmrhfasfg5suueegrnc4gsgyi2tyclcy5oz7f5drnrodmdtob6t2ioyd
# consensus-health.torproject.org
- tkskz5dkjel4xqyw5d5l3k52kgglotwn6vgb5wrl2oa5yi2szvywiyid
# crm.torproject.org
- 6ojylpznauimd2fga3m7g24vd7ebkzlemxdprxckevqpzs347ugmynqd
# deb.torproject.org
- apow7mjfryruh65chtdydfmqfpj5btws7nbocgtaovhvezgccyjazpqd
# dev.crm.torproject.org
- eewp4iydzyu2a5d6bvqadadkozxdbhsdtmsoczu2joexfrjjsheaecad
# dist.torproject.org
- scpalcwstkydpa3y7dbpkjs2dtr7zvtvdbyj3dqwkucfrwyixcl5ptqd
# donate-api.torproject.org
- lkfkuhcx62yfvzuz5o3ju4divuf4bsh2bybwd3oierq47kyp2ig2gvid
# donate.torproject.org
- yoaenchicimox2qdc47p36zm3cuclq7s7qxx6kvxqaxjodigfifljqqd
# exonerator.torproject.org
- pm46i5h2lfewyx6l7pnicbxhts2sxzacvsbmqiemqaspredf2gm3dpad
# extra.torproject.org
- kkr72iohlfix5ipjg776eyhplnl2oiv5tz4h2y2bkhjix3quafvjd5ad
# gettor.torproject.org
- ueghr2hzndecdntou33mhymbbxj7pir74nwzhqr6drhxpbz3j272p4id
# git.torproject.org
- xtlfhaspqtkeeqxk6umggfbr3gyfznvf4jhrge2fujz53433i2fcs3id
# gitlab.torproject.org
- eweiibe6tdjsdprb4px6rqrzzcsi22m4koia44kc5pcjr7nec2rlxyad
# gitweb.torproject.org
- gzgme7ov25seqjbphab4fkcph3jkobfwwpivt5kzbv3kqx2y2qttl4yd
# grafana1.torproject.org
- 7zjnw5lx2x27rwiocxkqdquo7fawj46mf2wiu2l7e6z6ng6nivmdxnad
# grafana2.torproject.org
- f3vd6fyiccuppybkxiblgigej3pfvvqzjnhd3wyv7h4ee5asawf2fhqd
# ircbouncer.torproject.org
- moz5kotsnjony4oxccxfo4lwk3pvoxmdoljibhgoonzgzjs5oemtjmqd
# metabase.metrics.torproject.org
- gr5pseamigereei4c6654hafzhid5z2c3oqzn6cfnx7yfyelt47znhad
# metrics.torproject.org
- hctxrvjzfpvmzh2jllqhgvvkoepxb4kfzdjm6h7egcwlumggtktiftid
# moat.torproject.org
- z7m7ogzdhu43nosvjtsuplfmuqa3ge5obahixydhmzdox6owwxfoxzid
# nagios.torproject.org
- w6vizvw4ckesva5fvlkrepynemxdq6pgo5sh4r76ec6msq5notkhqryd
# newsletter.torproject.org
- a4ygisnerpgtc5ayerl22pll6cls3oyj54qgpm7qrmb66xrxts6y3lyd
# nightlies.tbb.torproject.org
- umj4zbqdfcyevlkgqgpq6foxk3z75zzxsbgt5jqmfxofrbrjh3crbnad
# nyx.torproject.org
- 3ewfgrt4gzfccp6bnquhqb266r3zepiqpnsk3falwygkegtluwuyevid
- xao2lxsmia2edq2n5zxg6uahx6xox2t7bfjw6b5vdzsxi7ezmqob6qid
- dud2sxm6feahhuwj4y4lzktduy7v3qpaqsfkggtj2ojmzathttkegoid
# openpgpkey.torproject.org
- 2yldcptk56shc7lwieozoglw3t5ghty7m6mf2faysvfnzccqavbu2mad
# people.torproject.org
- 5ecey6oe4rocdsfoigr4idu42cecm2j7zfogc3xc7kfn4uriehwrs6qd
# prometheus1.torproject.org
- ydok5jiruh3ak6hcfdlm2g7iuraaxcomeckj2nucjsxif6qmrrda2byd
# prometheus2.torproject.org
- vyo6yrqhl3by7d6n5t6hjkflaqbarjpqjnvapr5u5rafk4imnfrmcjyd
# rbm.torproject.org
- nkuz2tpok7ctwd5ueer5bytj3bm42vp7lgjcsnznal3stotg6vyaakyd
# research.torproject.org
- xhqthou6scpfnwjyzc3ekdgcbvj76ccgyjyxp6cgypxjlcuhnxiktnqd
# review.torproject.net
- zhkhhhnppc5k6xju7n25rjba3wuip73jnodicxl65qdpchrwvvsilcyd
# rpm.torproject.org
- 4ayyzfoh5qdrokqaejis3rdredhvf22n3migyxfudpkpunngfc7g4lqd
# snowflake.torproject.org
- oljlphash3bpqtrvqpr5gwzrhroziw4mddidi5d2qa4qjejcbrmoypqd
# spec.torproject.org
- i3xi5qxvbrngh3g6o7czwjfxwjzigook7zxzjmgwg5b7xnjcn5hzciad
# staging-api.donate.torproject.org
- vorwws6g6mx23djlznmlqva4t5olulpnet6fxyiyytcu5dorp3fstdqd
# staging.crm.torproject.org
- pt34uujusar4arrvsqljndqlt7tck2d5cosaav5xni4nh7bmvshyp2yd
# staging.donate-api.torproject.org
- 7niqsyixinnhxvh33zh5dqnplxnc2yd6ktvats3zmtbbpzcphpbsa6qd
# status.torproject.org
- eixoaclv7qvnmu5rolbdwba65xpdiditdoyp6edsre3fitad777jr3ad
# stem.torproject.org
- mf34jlghauz5pxjcmdymdqbe5pva4v24logeys446tdrgd5lpsrocmqd
# styleguide.torproject.org
- 7khzpw47s35pwo3lvtctwf2szvnq3kgglvzc22elx7of2awdzpovqmqd
# submission.torproject.org
- givpjczyrb5jjseful3o5tn3tg7tidbu4gydl4sa5ekpcipivqaqnpad
# support.torproject.org
- rzuwtpc4wb3xdzrj3yeajsvm3fkq4vbeubm2tdxaqruzzzgs5dwemlad
# survey.torproject.org
- eh5esdnd6fkbkapfc6nuyvkjgbtnzq2is72lmpwbdbxepd2z7zbgzsqd
# svn-archive.torproject.org
- b63iq6es4biaawfilwftlfkw6a6putogxh4iakei2ioppb7dsfucekyd
# tb-manual.torproject.org
- dsbqrprgkqqifztta6h3w7i2htjhnq7d3qkh3c7gvc35e66rrcv66did
# test-api.donate.torproject.org
- wiofesr5qt2k7qrlljpk53isgedxi6ddw6z3o7iay2l7ne3ziyagxaid
# test-data.tbb.torproject.org
- umbk3kbgov4ekg264yulvbrpykfye7ohguqbds53qn547mdpt6o4qkad
# test.crm.torproject.org
- a4d52y2erv4eijii66cpnyqn7rsnnq3gmtrsdxzt2laoutvu4gz7fwid
# test.donate-api.torproject.org
- i4zhrn4md3ucd5dfgeo5lnqd3jy2z2kzp3lt4tdisvivzoqqtlrymkid
# www
- tttyx2vwp7ihml3vkhywwcizv6nbwrikpgeciy3qrow7l7muak2pnhad
# www.torproject.org
- 2gzyxa5ihm7nsggfxnu52rck2vv4rvmdlkiu3zzui5du4xyclen53wid
"""
# we check these each time, but we got them by sorting bad relays
# in the wild; we keep a copy here so we can avoid retesting
yKNOWN_NODNS = """
---
- 0x0.is
- a9.wtf
- aklad5.com
- artikel5ev.de
- arvanode.net
- dodo.pm
- dra-family.github.io
- eraldonion.org
- erjan.net
- apt96.com
- axims.net
- backup.spekadyon.org
- dfri.se
- dotsrc.org
- dtf.contact
- ezyn.de
- for-privacy.net
- galtland.network
- ineapple.cx
- lonet.sh
- moneneis.de
- olonet.sh
- or-exit-2.aa78i2efsewr0neeknk.xyz
- or.wowplanet.de
- ormycloud.org
- plied-privacy.net
- rivacysvcs.net
- redacted.org
- rification-for-nusenu.net
- rofl.cat
- rsv.ch
- sv.ch
- heraldonion.org
- interfesse.net
- kryptonit.org
- linkspartei.org
- mkg20001.io
- nicdex.com
- nx42.de
- pineapple.cx
- privacylayer.xyz
- privacysvcs.net
- prsv.ch
- sebastian-elisa-pfeifer.eu
- thingtohide.nl
- tikel10.org
- tor.wowplanet.de
- tor-exit-2.aa78i2efsewr0neeknk.xyz
- tor-exit-3.aa78i2efsewr0neeknk.xyz
- torix-relays.org
- tse.com
- tor.dlecan.com
- tor.skankhunt42.pw
- transliberation.today
- tuxli.org
- w.digidow.eu
- w.cccs.de
- unzane.com
- verification-for-nusenu.net
- www.defcon.org
"""
# - aklad5.com
# - artikel5ev.de
# - arvanode.net
# - dodo.pm
# - erjan.net
# - galtland.network
# - lonet.sh
# - moneneis.de
# - olonet.sh
# - or-exit-2.aa78i2efsewr0neeknk.xyz
# - or.wowplanet.de
# - ormycloud.org
# - plied-privacy.net
# - rivacysvcs.net
# - redacted.org
# - rofl.cat
# - sv.ch
# - tikel10.org
# - tor.wowplanet.de
# - torix-relays.org
# - tse.com
# - w.digidow.eu
# - w.cccs.de
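The yKNOWN_NODNS block above is a YAML document. A minimal sketch of how such a flat list can be parsed into Python without pulling in PyYAML (the name lYamlBadNodes is hypothetical, chosen to match this file's Hungarian-style prefixes):

```python
def lYamlBadNodes(yaml_text: str) -> list:
    """Parse a flat YAML list of '- domain' entries into a Python list.

    Skips the '---' document marker, blank lines, and '#' comments.
    """
    domains = []
    for line in yaml_text.splitlines():
        line = line.strip()
        if not line.startswith('- '):
            continue
        entry = line[2:].split('#', 1)[0].strip()
        if entry:
            domains.append(entry)
    return domains
```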
def oMakeController(sSock='', port=9051):
import getpass
if sSock and os.path.exists(sSock):
controller = Controller.from_socket_file(path=sSock)
else:
controller = Controller.from_port(port=port)
sys.stdout.flush()
p = getpass.unix_getpass(prompt='Controller Password: ', stream=sys.stderr)
controller.authenticate(p)
return controller
oSTEM_CONTROLER = None
def oGetStemController(log_level=10, sock_or_pair='/run/tor/control', password=None):
global oSTEM_CONTROLER
if oSTEM_CONTROLER: return oSTEM_CONTROLER
from stem.util.log import Runlevel
Runlevel = log_level
import stem.util.log
# stem.util.log.Runlevel = 'DEBUG' = 20 # log_level
if os.path.exists(sock_or_pair):
LOG.info(f"controller from socket {sock_or_pair}")
controller = Controller.from_socket_file(path=sock_or_pair)
else:
if ':' in sock_or_pair:
if type(sock_or_pair) == int:
port = sock_or_pair
elif ':' in sock_or_pair:
port = sock_or_pair.split(':')[1]
else:
port = sock_or_pair
@ -90,8 +258,28 @@ def oGetStemController(log_level=10, sock_or_pair='/run/tor/control'):
except: port = 9051
LOG.info(f"controller from port {port}")
controller = Controller.from_port(port=port)
if password is None:
password = os.environ.get('TOR_CONTROLLER_PASSWORD', '')
print(f"DBUG: trying TOR_CONTROLLER_PASSWORD {len(password)}")
else:
# print(f"DBUG: using a password {len(password)}")
pass
if not password:
# print("DBUG: trying without a password")
try:
controller.authenticate()
oSTEM_CONTROLER = controller
return controller
except MissingPassword as e:
pass  # drop through
except Exception as e:
print(f"WARN: error trying to authenticate {e}")
#? return None
sys.stdout.flush()
password = getpass.unix_getpass(prompt='Controller Password: ', stream=sys.stderr)
try:
controller.authenticate(password)
except (Exception, MissingPassword):
sys.stdout.flush()
p = getpass.unix_getpass(prompt='Controller Password: ', stream=sys.stderr)
@ -111,22 +299,6 @@ def bAreWeConnected():
i += 1
return i > 0
def sMapaddressResolv(target, iPort=9051, log_level=10):
if not stem:
LOG.warn('please install the stem Python package')
return ''
try:
controller = oGetStemController(log_level=log_level)
map_dict = {"0.0.0.0": target}
map_ret = controller.map_address(map_dict)
return map_ret
except Exception as e:
LOG.exception(e)
return ''
def vwait_for_controller(controller, wait_boot=10):
if bAreWeConnected() is False:
raise SystemExit("we are not connected")
@ -145,7 +317,8 @@ def bin_to_hex(raw_id, length=None):
res = ''.join('{:02x}'.format(raw_id[i]) for i in range(length))
return res.upper()
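Since Python 3.5, `bytes.hex()` can replace the format-join in `bin_to_hex`; an equivalent sketch (named bin_to_hex2 here only to avoid shadowing the original):

```python
def bin_to_hex2(raw_id: bytes, length=None) -> str:
    # same output as bin_to_hex above: upper-case hex of the first
    # `length` bytes, defaulting to the whole buffer
    if length is None:
        length = len(raw_id)
    return raw_id[:length].hex().upper()
```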
def lIntroductionPoints(controller=None, lOnions=[], itimeout=120, log_level=10,
                        password=None):
"""now working !!! stem 1.8.x timeout must be huge >120
'Provides the descriptor for a hidden service. The **address** is the
'.onion' address of the hidden service '
@ -154,22 +327,25 @@ def lIntroductionPoints(controller=None, lOnions=[], itimeout=120, log_level=10)
try:
from cryptography.utils import int_from_bytes
except ImportError:
import cryptography.utils
# guessing - not in the current cryptography but stem expects it
def int_from_bytes(*args, **kwargs): return int.from_bytes(*args, **kwargs)
cryptography.utils.int_from_bytes = int_from_bytes
# this will fail if the trick above didn't work
from stem.prereq import is_crypto_available
is_crypto_available(ed25519=True)
from queue import Empty
from stem import Timeout
from stem.client.datatype import LinkByFingerprint
from stem.descriptor.hidden_service import HiddenServiceDescriptorV3
if type(lOnions) not in [set, tuple, list]:
lOnions = list(lOnions)
if controller is None:
controller = oGetStemController(log_level=log_level, password=password)
l = []
for elt in lOnions:
LOG.info(f"controller.get_hidden_service_descriptor {elt}")
@ -194,15 +370,16 @@ def lIntroductionPoints(controller=None, lOnions=[], itimeout=120, log_level=10)
for introduction_point in n:
for linkspecifier in introduction_point.link_specifiers:
if isinstance(linkspecifier, LinkByFingerprint):
# LOG.log(40, f"Getting fingerprint for {linkspecifier}")
if hasattr(linkspecifier, 'fingerprint'):
assert len(linkspecifier.value) == 20
lp += [bin_to_hex(linkspecifier.value)]
LOG.info(f"{len(lp)} introduction points for {elt}")
l += lp
except (Empty, Timeout,) as e:  # noqa
LOG.warn(f"Timed out getting introduction points for {elt}")
continue
except stem.DescriptorUnavailable as e:
LOG.error(e)
except Exception as e:
LOG.exception(e)
return l
@ -210,13 +387,13 @@ def lIntroductionPoints(controller=None, lOnions=[], itimeout=120, log_level=10)
def zResolveDomain(domain):
try:
ip = sTorResolve(domain)
except Exception as e:  # noqa
ip = ''
if ip == '':
try:
lpair = getaddrinfo(domain, 443)
except Exception as e:
LOG.warn(f"{e}")
lpair = None
if lpair is None:
LOG.warn(f"TorResolv and getaddrinfo failed for {domain}")
@ -233,13 +410,12 @@ def sTorResolve(target,
):
MAX_INFO_RESPONSE_PACKET_LENGTH = 8
if '@' in target:
LOG.warn(f"sTorResolve failed invalid hostname {target}")
return ''
target = target.strip('/')
seb = b"\x04\xf0\x00\x00\x00\x00\x00\x01\x00"
seb += bytes(target, 'US-ASCII') + b"\x00"
assert len(seb) == 10 + len(target), str(len(seb)) + repr(seb)
# LOG.debug(f"0 Sending {len(seb)} to The TOR proxy {seb}")
@ -247,7 +423,7 @@ def sTorResolve(target,
sock.connect((sHost, iPort))
sock.settimeout(SOCK_TIMEOUT_SECONDS)
oRet = sock.sendall(seb)  # noqa
i = 0
data = ''
@ -261,8 +437,7 @@ def sTorResolve(target,
flags=socket.MSG_WAITALL
data = sock.recv(MAX_INFO_RESPONSE_PACKET_LENGTH, flags)
except socket.timeout:
LOG.warn(f"4 The TOR proxy {(sHost, iPort)}"
         f" didn't reply in {SOCK_TIMEOUT_SECONDS} sec. #{i}")
except Exception as e:
@ -271,7 +446,7 @@ def sTorResolve(target,
+" errored with " + str(e)
+" #" +str(i))
sock.close()
return ''
else:
if len(data) > 0: break
@ -280,9 +455,9 @@ def sTorResolve(target,
sLabel = "5 No reply #"
else:
sLabel = "5 No data #"
LOG.warn(f"sTorResolve: {sLabel} {i} on {sHost}:{iPort}")
sock.close()
return ''
assert len(data) >= 8
packet_sf = data[1]
@ -292,7 +467,7 @@ def sTorResolve(target,
return f"{data[4]}.{data[5]}.{data[6]}.{data[7]}"
else:
# 91
LOG.warn(f"tor-resolve failed for {target} on {sHost}:{iPort}")
os.system(f"tor-resolve -4 {target} > /tmp/e 2>/dev/null")
# os.system("strace tor-resolve -4 "+target+" 2>&1|grep '^sen\|^rec'")
@ -316,13 +491,30 @@ def getaddrinfo(sHost, sPort):
return None
return lPair
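The `seb` request built in `sTorResolve` is Tor's SOCKS4a RESOLVE extension (command 0xF0 with the invalid address 0.0.0.1 signalling that a hostname follows). A sketch of the same packet built with `struct`, assuming the layout shown above; the helper name is hypothetical:

```python
import struct

def socks4a_resolve_request(hostname: str) -> bytes:
    # SOCKS4a header: version 4, Tor's RESOLVE command 0xF0, port 0,
    # then the invalid IP 0.0.0.1 that signals "hostname follows"
    pkt = struct.pack('>BBH4s', 4, 0xF0, 0, b'\x00\x00\x00\x01')
    pkt += b'\x00'                                # empty userid, NUL-terminated
    pkt += hostname.encode('us-ascii') + b'\x00'  # hostname, NUL-terminated
    return pkt
```

The length assertion in `sTorResolve` (`len(seb) == 10 + len(target)`) follows directly: 8 header bytes, one NUL for the userid, the hostname, and its trailing NUL.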
# unused?
def sMapaddressResolv(target, iPort=9051, log_level=10, password=None):
if not stem:
LOG.warn('please install the stem Python package')
return ''
try:
controller = oGetStemController(log_level=log_level, password=password)
map_dict = {"0.0.0.0": target}
map_ret = controller.map_address(map_dict)
return map_ret
except Exception as e:
LOG.exception(e)
return ''
def icheck_torrc(sFile, oArgs):
l = open(sFile, 'rt').readlines()
a = {}
for elt in l:
elt = elt.strip()
if not elt or ' ' not in elt: continue
(k, v,) = elt.split(' ', 1)
a[k] = v
keys = a
@ -358,7 +550,7 @@ def icheck_torrc(sFile, oArgs):
print('VirtualAddrNetworkIPv4 172.16.0.0/12')
return 0
def lExitExcluder(oArgs, iPort=9051, log_level=10):
def lExitExcluder(oArgs, iPort=9051, log_level=10, password=None):
"""
https://raw.githubusercontent.com/nusenu/noContactInfo_Exit_Excluder/main/exclude_noContactInfo_Exits.py
"""
@ -368,7 +560,7 @@ def lExitExcluder(oArgs, iPort=9051, log_level=10):
LOG.debug('lExcludeExitNodes')
try:
controller = oGetStemController(log_level=log_level, password=password)
# generator
relays = controller.get_server_descriptors()
except Exception as e:
@ -398,5 +590,5 @@ def lExitExcluder(oArgs, iPort=9051, log_level=10):
if __name__ == '__main__':
target = 'duckduckgogg42xjoc72x3sjasowoarfbgcmvfimaftt6twagswzczad'
controller = oGetStemController(log_level=10, password=None)
lIntroductionPoints(controller, [target], itimeout=120)


@ -0,0 +1,567 @@
#!/usr/bin/env python3
"""
Tor Contact Info Parser - A tool/Python Class for parsing Tor ContactInfo Information Sharing v2 specification contacts
Written by Eran Sandler (https://twitter.com/erans) (C) 2018
Turned into a proper command-line tool with sub-commands and flags by @Someguy123 at Privex Inc. (C) 2021
(https://www.privex.io) (https://github.com/PrivexInc)
This is a parser for the Tor ContactInfo Information Sharing Specification v2 (https://nusenu.github.io/ContactInfo-Information-Sharing-Specification/).
The parser can parse the ContactInfo field of Tor relays based on the specification.
Official Repo: https://github.com/erans/torcontactinfoparser
Privex Fork: https://github.com/Privex/torcontactinfoparser
Released under the MIT License.
"""
import argparse
import os
import re
import sys
import json
import requests
import textwrap
try:
from rich import print as rprint
HAS_RICH = True
except ImportError:
def rprint(value='', *args, **kwargs):
if value not in [None, False, True] and isinstance(value, (dict, list, set, tuple)):
value = json.dumps(value, indent=4)
return print(value, *args, **kwargs)
# rprint = print
HAS_RICH = False
import logging
import warnings
warnings.filterwarnings('ignore')
from exclude_utils import vsetup_logging
class TorContactInfoParser(object):
email_regex = "^[a-zA-Z0-9.!#$%&*+/=?^_`{|}~-]+@[a-zA-Z0-9-]+(?:\\.[a-zA-Z0-9-]+)*$"
def _parse_string_value(self, value, min_length, max_length, valid_chars, raise_exception=False, field_name=None, deobfuscate_email=False):
value_length = len(value)
if value_length < min_length:
if raise_exception:
raise ValueError("value of field '{0}' is too short".format(field_name))
return None
if value_length > max_length:
if raise_exception:
raise ValueError("value of field '{0}' is too long".format(field_name))
return None
if valid_chars != "*":
m = re.search(valid_chars, value)
if not m:
if raise_exception:
raise ValueError("value of field '{0}' doesn't match valid chars restrictions".format(field_name))
else:
return None
return value
def _parse_email_value(self, value, field_name, raise_exception, deobfuscate_email):
if value:
v = value.replace("[]", "@")
if re.search(self.email_regex, v):
if not deobfuscate_email:
return v.replace("@", "[]")
return v
return None
_supported_fields_parsers = {
"email" : {
"fn": _parse_email_value,
"args": {}
},
"url" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 4,
"max_length" : 399,
"valid_chars" : "[_%/:a-zA-Z0-9.-]+"
}
},
"proof" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 7,
"max_length" : 7,
"valid_chars" : "[adinrsu-]+"
}
},
"ciissversion" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 1,
"max_length" : 1,
"valid_chars" : "[12]+"
}
},
"pgp" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 40,
"max_length" : 40,
"valid_chars" : "[a-zA-Z0-9]+"
}
},
"abuse" : {
"fn": _parse_email_value,
"args": {}
},
"keybase" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 0,
"max_length" : 50,
"valid_chars" : "[a-zA-Z0-9]+"
}
},
"twitter" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 1,
"max_length" : 15,
"valid_chars" : "[a-zA-Z0-9_]+"
}
},
"mastodon" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 0,
"max_length" : 254,
"valid_chars" : "*"
}
},
"matrix" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 0,
"max_length" : 254,
"valid_chars" : "*"
}
},
"xmpp" : {
"fn": _parse_email_value,
"args": {}
},
"otr3" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 40,
"max_length" : 40,
"valid_chars" : "[a-z0-9]+"
}
},
"hoster" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 0,
"max_length" : 254,
"valid_chars" : "[a-zA-Z0-9.-]+"
}
},
"cost" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 0,
"max_length" : 13,
"valid_chars" : "[A-Z0-9.]+"
}
},
"uplinkbw" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 0,
"max_length" : 7,
"valid_chars" : "[0-9]+"
}
},
"trafficacct" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 0,
"max_length" : 9,
"valid_chars" : "[unmetrd0-9]+"
}
},
"memory" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 0,
"max_length" : 10,
"valid_chars" : "[0-9]+"
}
},
"cpu" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 0,
"max_length" : 50,
"valid_chars" : "[a-zA-Z0-9_-]+"
}
},
"virtualization" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 0,
"max_length" : 15,
"valid_chars" : "[a-z-]+"
}
},
"donationurl" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 0,
"max_length" : 254,
"valid_chars" : "*"
}
},
"btc" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 26,
"max_length" : 99,
"valid_chars" : "[a-zA-Z0-9]+"
}
},
"zec" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 0,
"max_length" : 95,
"valid_chars" : "[a-zA-Z0-9]+"
}
},
"xmr" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 0,
"max_length" : 99,
"valid_chars" : "[a-zA-Z0-9]+"
}
},
"offlinemasterkey" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 1,
"max_length" : 1,
"valid_chars" : "[yn]"
}
},
"signingkeylifetime" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 0,
"max_length" : 6,
"valid_chars" : "[0-9]+"
}
},
"sandbox" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 1,
"max_length" : 2,
"valid_chars" : "[yn]"
}
},
"os" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 0,
"max_length" : 20,
"valid_chars" : "[A-Za-z0-9/.]+"
}
},
"tls" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 0,
"max_length" : 14,
"valid_chars" : "[a-z]+"
}
},
"aesni" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 1,
"max_length" : 1,
"valid_chars" : "[yn]"
}
},
"autoupdate" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 1,
"max_length" : 1,
"valid_chars" : "[yn]"
}
},
"confmgmt" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 1,
"max_length" : 15,
"valid_chars" : "[a-zA-Z-]"
}
},
"dnslocation" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 5,
"max_length" : 100,
"valid_chars" : "[a-z,]"
}
},
"dnsqname" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 1,
"max_length" : 1,
"valid_chars" : "[yn]"
}
},
"dnssec" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 1,
"max_length" : 1,
"valid_chars" : "[yn]"
}
},
"dnslocalrootzone" : {
"fn" : _parse_string_value,
"args" : {
"min_length" : 1,
"max_length" : 1,
"valid_chars" : "[yn]"
}
}
}
def __init__(self):
pass
def parse(self, value: str, raise_exception_on_invalid_value=False, deobfuscate_email=True) -> dict:
# the ciissversion field is mandatory
if 'ciissversion:' not in value:
return None
result = {}
parts = value.split(" ")
for p in parts:
field_parts = p.split(":", 1)
if len(field_parts) <= 1:
continue
name, data = field_parts
if name in self._supported_fields_parsers:
field_parser = self._supported_fields_parsers[name]
if field_parser is None:
result[name] = data
continue
if callable(field_parser):
value = field_parser(self, data)
else:
field_parser["args"]["field_name"] = name
field_parser["args"]["value"] = data
field_parser["args"]["raise_exception"] = raise_exception_on_invalid_value
field_parser["args"]["deobfuscate_email"] = deobfuscate_email
value = field_parser["fn"](self, **field_parser["args"])
if not result.get(name, None):
result[name] = value
return result
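Stripped of per-field validation, the tokenizing that `parse` performs reduces to a split on spaces plus a split on the first `:`; a minimal sketch (split_contact_fields is a hypothetical name):

```python
def split_contact_fields(contact: str) -> dict:
    # "name:data" pairs separated by spaces; tokens without a ':' or
    # with empty data are skipped, and the first value for a name wins
    result = {}
    for part in contact.split(" "):
        name, sep, data = part.partition(":")
        if sep and data and name not in result:
            result[name] = data
    return result
```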
def cmd_parse(opts: argparse.Namespace):
"""
ArgParser function for parsing a single ContactInfo string, and outputting it as JSON (or python-style dict's)
"""
if opts.contact is None or len(opts.contact) == 0 or opts.contact[0] == '-':
contact = sys.stdin.read()
else:
contact = ' '.join(opts.contact).strip()
tparser = TorContactInfoParser()
res = tparser.parse(contact)
if not opts.pretty:
return print(json.dumps(res))
if opts.json:
res = json.dumps(res, indent=4) if opts.pretty else json.dumps(res)
# if not HAS_RICH: res = json.dumps(res, indent=4)
rprint(res)
def cmd_scan(opts: argparse.Namespace, adata=None) -> int:
"""
ArgParser function for scanning all ContactInfo strings from ``https://onionoo.torproject.org/details`` ,
and outputting each one as a Python-style Dict, or JSON.
"""
parser = TorContactInfoParser()
surl = "https://onionoo.torproject.org/details"
if not adata:
LOG.info(f"Getting relays from {surl}")
jdata = requests.get(surl)
try:
adata = jdata.json()
except Exception as e:
# simplejson.errors.JSONDecodeError
LOG.exception(f"JSON error {e}")
return
elts = adata["relays"]
else:
elts = json.loads(adata)['relays']
if not elts:
LOG.warn("NO relays - are we connected?")
return
LOG.info(f"{len(elts)} relays")
for relay in elts:
if 'fingerprint' not in relay.keys():
LOG.warn(f"fingerprint not in relay for {relay}")
continue
fp = relay['fingerprint']
verified_host_names = relay.get('verified_host_names', [])
contact = relay.get("contact", None)
if not contact:
LOG.warn(f"No contact for {fp} {verified_host_names}")
continue
if 'ciissversion' not in contact:
LOG.debug(f"No ciissversion in contact in {fp}")
continue
LOG.debug(f"parsing {fp}")
result = parser.parse(contact, False)
if not result:
LOG.warn(f"No result for {contact} in {fp}")
continue
if len(result) > 0:
if opts.json: result = json.dumps(result, indent=4) if opts.pretty else json.dumps(result)
if opts.pretty:
rprint(result)
else:
print(result)
return 0
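The relay-filtering steps in `cmd_scan` (fingerprint present, contact present, `ciissversion` mentioned) can be isolated into a generator and exercised against canned onionoo-style data; iter_ciiss_contacts is a hypothetical name:

```python
def iter_ciiss_contacts(relays: list):
    # the filtering cmd_scan applies before parsing: skip relays with
    # no fingerprint, no contact, or no ciissversion field in the contact
    for relay in relays:
        fp = relay.get('fingerprint')
        contact = relay.get('contact')
        if fp and contact and 'ciissversion' in contact:
            yield fp, contact
```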
ETC_DIR = '/etc/tor/yaml'
def oparser():
cparser = argparse.ArgumentParser(
formatter_class=argparse.RawDescriptionHelpFormatter,
description=textwrap.dedent(f"""
Examples:
# 'scan' is the original behaviour of this script. It iterates over the data
# from https://onionoo.torproject.org/details , parses each contact, and prints it as Python dict-style JSON.
{sys.argv[0]} scan
# Same as previous. With no arguments, it's equivalent to running 'scan'.
{sys.argv[0]}
# If you pass '-p' after scan, it will enable pretty printing. For best pretty printing,
# make sure you have 'rich' installed from pypi.
{sys.argv[0]} scan -p
# If you need real JSON with double quotes, rather than Python dict-style JSON, you can
# use the '-j' flag to enable "real JSON" mode (you can combine with '-p' if you want pretty printed real json)
{sys.argv[0]} scan -j
# Using 'parse', you can parse an arbitrary ContactInfo string, and it will output the parsed result
# with pretty printing by default.
{sys.argv[0]} parse "contact Privex Inc. email:noc[]privex.io url:https://www.privex.io " \\
"proof:uri-rsa pgp:288DD1632F6E8951 keybase:privexinc twitter:PrivexInc hoster:www.privex.io " \\
"uplinkbw:500 memory:4096 virtualization:kvm btc:bc1qpst9uscvd8rpjjhzz9rau3trylh6e0wh76qrlhw3q9nj89ua728sn3t6a2 " \\
"xmr:89tukP3wfpH4FZAmC1D2GfArWwfPTz8Ap46NZc54Vyhy9YxEUYoFQ7HGQ74LrCMQTD3zxvwM1ewmGjH9WVmeffwR72m1Pps"
{{
'email': 'noc@privex.io',
'url': 'https://www.privex.io',
'proof': 'uri-rsa',
'pgp': None,
'keybase': 'privexinc',
'twitter': 'PrivexInc',
'hoster': 'www.privex.io',
'uplinkbw': '500',
'memory': '4096',
'virtualization': 'kvm',
'btc': 'bc1qpst9uscvd8rpjjhzz9rau3trylh6e0wh76qrlhw3q9nj89ua728sn3t6a2',
'xmr': '89tukP3wfpH4FZAmC1D2GfArWwfPTz8Ap46NZc54Vyhy9YxEUYoFQ7HGQ74LrCMQTD3zxvwM1ewmGjH9WVmeffwR72m1Pps'
}}
# You can also pipe a contact string into 'parse', and it will work just the same.
echo "Privex Inc. email:noc[]privex.io url:https://www.privex.io proof:uri-rsa pgp:288DD1632F6E8951 keybase:privexinc twitter:PrivexInc" | {sys.argv[0]} parse
{{'email': 'noc@privex.io', 'url': 'https://www.privex.io', 'proof': 'uri-rsa', 'pgp': None, 'keybase': 'privexinc', 'twitter': 'PrivexInc\n'}}
# If you need real JSON outputted, rather than Python dict-style output, you can pass -j to either 'parse' or 'scan'
{sys.argv[0]} parse -j "Privex Inc. email:noc[]privex.io url:https://www.privex.io proof:uri-rsa pgp:288DD1632F6E8951 keybase:privexinc twitter:PrivexInc"
{{
"email": "noc@privex.io",
"url": "https://www.privex.io",
"proof": "uri-rsa",
"pgp": null,
"keybase": "privexinc",
"twitter": "PrivexInc"
}}
# You can use '-np' to disable pretty printing for 'parse' - you can combine it with '-j' to get flat, plain JSON.
{sys.argv[0]} parse -np -j "Privex Inc. email:noc[]privex.io url:https://www.privex.io proof:uri-rsa pgp:288DD1632F6E8951 keybase:privexinc twitter:PrivexInc"
{{"email": "noc@privex.io", "url": "https://www.privex.io", "proof": "uri-rsa", "pgp": null, "keybase": "privexinc", "twitter": "PrivexInc"}}
"""))
cparser.set_defaults(func=cmd_scan, json=False, pretty=False)
subparse = cparser.add_subparsers()
subparse.required = False
sp_parse = subparse.add_parser('parse',
help="Parse a single contact string, either as an argument, or piped into stdin")
sp_parse.add_argument('contact', nargs='*')
sp_parse.add_argument('-np', '--no-pretty',
action='store_false', default=False, dest='pretty',
help="Disable pretty printing JSON")
sp_parse.add_argument('--relays_output', type=str,
dest='relays_output',
default=os.path.join(ETC_DIR, 'relays.json'),
help="Write the download relays in json to a file")
sp_parse.add_argument('-j', '--json', action='store_true',
default=False, dest='json',
help="Output real JSON, not Python dict format.")
sp_parse.set_defaults(func=cmd_parse)
sp_scan = subparse.add_parser('scan', help="Parse all contacts from https://onionoo.torproject.org/details")
sp_scan.add_argument('-p', action='store_true', default=False, dest='pretty', help="Enable pretty printing JSON")
sp_scan.add_argument('-j', '--json', action='store_true', default=False, dest='json', help="Output real JSON, not Python dict format.")
# sp_scan.set_defaults(func=cmd_scan)
return cparser
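`oparser` relies on the standard argparse dispatch idiom: each subcommand registers its handler with `set_defaults(func=...)`, and the top-level default makes a bare invocation equivalent to `scan`. A minimal sketch of that idiom:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    p = argparse.ArgumentParser()
    # a bare invocation falls through to the default handler
    p.set_defaults(func=lambda opts: 'scan')
    sub = p.add_subparsers()
    sp = sub.add_parser('parse')
    sp.set_defaults(func=lambda opts: 'parse')
    return p
```

The caller then runs `opts = parser.parse_args(...)` followed by `opts.func(opts)`, exactly as the `__main__` block below does via `cmd_scan`.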
if __name__ == "__main__":
if os.environ.get('DEBUG', ''):
log_level = 10
else:
log_level = 20
LOG = logging.getLogger()
vsetup_logging(LOG, log_level)
try:
cparser = oparser()
opts = cparser.parse_args(sys.argv[1:])
data = None
if opts.relays_output and os.path.exists(opts.relays_output):
data = open(opts.relays_output, 'rt').read()
i = cmd_scan(opts, data)
except KeyboardInterrupt as e:
i = 0
except (requests.exceptions.ProxyError, Exception,) as e:
LOG.error(f"{e}")
i = 0
sys.exit(i)


@ -3,31 +3,38 @@
# from https://github.com/nusenu/trustor-poc
# with minor refactoring to make the code more Pythonic.
import datetime
import os
import re
import sys
import ipaddress
import warnings
import requests
# from urllib.parse import urlparse
import urllib3.util
from urllib3.util import parse_url as urlparse
from stem.control import Controller
# from stem.util.tor_tools import *
try:
# unbound is not on pypi
from unbound import RR_CLASS_IN, RR_TYPE_TXT, ub_ctx
except:
ub_ctx = RR_TYPE_TXT = RR_CLASS_IN = None
global LOG
import logging
import warnings
warnings.filterwarnings('ignore')
LOG = logging.getLogger()
logging.getLogger("urllib3").setLevel(logging.INFO)
# import urllib3.contrib.pyopenssl
# urllib3.contrib.pyopenssl.inject_into_urllib3()
# download this python library from
# https://github.com/erans/torcontactinfoparser
#sys.path.append('/home/....')
# sys.path.append('/home/....')
try:
from torcontactinfo import TorContactInfoParser
except:
@ -42,7 +49,7 @@ def is_valid_hostname(hostname):
return False
if hostname[-1] == ".":
hostname = hostname[:-1] # strip exactly one dot from the right, if present
allowed = re.compile("(?!-)[A-Z0-9-]{1,63}(?<!-)$", re.IGNORECASE)
return all(allowed.match(x) for x in hostname.split("."))
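Written out self-contained, the hostname check above reads as follows (check_hostname is a hypothetical name so it does not shadow is_valid_hostname, and the 255-byte total-length guard is an assumption about the elided start of the function):

```python
import re

def check_hostname(hostname: str) -> bool:
    # each dot-separated label must be 1-63 chars of [A-Za-z0-9-] and
    # may not start or end with a hyphen (RFC 952 / RFC 1123 style)
    if not hostname or len(hostname) > 255:
        return False
    if hostname[-1] == ".":
        hostname = hostname[:-1]  # strip exactly one trailing dot
    allowed = re.compile(r"(?!-)[A-Z0-9-]{1,63}(?<!-)$", re.IGNORECASE)
    return all(allowed.match(x) for x in hostname.split("."))
```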
def read_local_trust_config(trust_config):
@ -127,7 +134,7 @@ def get_controller(address='127.0.0.1', port=9151, password=''):
'''
try:
# controller = Controller.from_socket_file(path=torsocketpath)
controller = Controller.from_port(address=address, port=port)
controller.authenticate(password=password)
except Exception as e:
@ -155,7 +162,7 @@ def find_validation_candidates(controller,
{ 'emeraldonion.org' : { 'uri-rsa': ['044600FD968728A6F220D5347AD897F421B757C0', '09DCA3360179C6C8A5A20DDDE1C54662965EF1BA']}}
'''
# https://github.com/nusenu/ContactInfo-Information-Sharing-Specification#proof
accepted_proof_types = ['uri-rsa', 'dns-rsa']
# https://github.com/nusenu/ContactInfo-Information-Sharing-Specification#ciissversion
accepted_ciissversions = ['2']
@ -186,15 +193,15 @@ def find_validation_candidates(controller,
if parsed_ci['ciissversion'] in accepted_ciissversions and prooftype in accepted_proof_types:
if ciurl.startswith('http://') or ciurl.startswith('https://'):
try:
domain = urlparse(ciurl).netloc
except:
LOG.warning('failed to parse domain %s' % ciurl)
domain = 'error'
continue
else:
domain = ciurl
if not is_valid_hostname(domain):
domain = 'error'
continue
# we can ignore relays that do not claim to be operated by a trusted operator
# if we do not accept all
@ -204,19 +211,20 @@ def find_validation_candidates(controller,
if prooftype in result[domain].keys():
result[domain][prooftype].append(fingerprint)
else:
result[domain] = {prooftype: [fingerprint]}
# mixed proof types are not allowd as per spec but we are not strict here
LOG.warning('%s is using mixed prooftypes %s' % (domain, prooftype))
else:
result[domain] = {prooftype: [fingerprint]}
return result
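The nested accumulation in `find_validation_candidates` builds a domain → prooftype → fingerprints mapping; a simplified sketch with `setdefault` (it omits the mixed-prooftype warning, and the helper name is hypothetical):

```python
def add_candidate(result: dict, domain: str, prooftype: str, fingerprint: str) -> dict:
    # result maps domain -> prooftype -> list of relay fingerprints,
    # matching the docstring example for find_validation_candidates
    result.setdefault(domain, {}).setdefault(prooftype, []).append(fingerprint)
    return result
```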
def oDownloadUrlRequests(uri, sCAfile, timeout=30, host='127.0.0.1', port=9050, content_type='text/plain', session=None):
import requests
# socks proxy used for outbound web requests (for validation of proofs)
proxy = {'https': f"socks5h://{host}:{port}"}
# we use this UA string when connecting to webservers to fetch rsa-fingerprint.txt proof files
# https://nusenu.github.io/ContactInfo-Information-Sharing-Specification/#uri-rsa
headers = {'User-Agent':'Mozilla/5.0 (Windows NT 10.0; rv:91.0) Gecko/20100101 Firefox/91.0'}
headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; rv:91.0) Gecko/20100101 Firefox/91.0'}
LOG.debug("fetching %s...." % uri)
try:
@@ -224,6 +232,7 @@ def oDownloadUrlRequests(uri, sCAfile, timeout=30, host='127.0.0.1', port=9050):
# urllib3.connection WARNING Certificate did not match expected hostname:
head = requests.head(uri, timeout=timeout, proxies=proxy, headers=headers)
except Exception as e:
LOG.exception(f"{e}")
raise TrustorError(f"HTTP HEAD request failed for {uri} {e}")
if head.status_code >= 300:
@@ -233,15 +242,15 @@ def oDownloadUrlRequests(uri, sCAfile, timeout=30, host='127.0.0.1', port=9050):
if not os.path.exists(sCAfile):
raise TrustorError(f"File not found CAfile {sCAfile}")
if session is None:
    session = requests.sessions.Session()
try:
with requests.sessions.Session() as session:
oReqResp = session.request(method="get", url=uri,
proxies=proxy,
timeout=timeout,
headers=headers,
allow_redirects=False,
verify=True
)
oReqResp = session.request(method="get", url=uri,
proxies=proxy,
timeout=timeout,
headers=headers,
allow_redirects=False,
verify=True
)
except Exception:
LOG.warning("HTTP GET request failed for %s" % uri)
raise
@@ -250,31 +259,79 @@ def oDownloadUrlRequests(uri, sCAfile, timeout=30, host='127.0.0.1', port=9050):
if not oReqResp.headers['Content-Type'].startswith('text/plain'):
raise TrustorError(f"HTTP Content-Type != text/plain")
#check for redirects (not allowed as per spec)
# check for redirects (not allowed as per spec)
if oReqResp.url != uri:
LOG.error(f'Redirect detected %s vs %s (final)' % (uri, oReqResp.url))
raise TrustorError(f'Redirect detected %s vs %s (final)' % (uri, oReqResp.url))
LOG.error(f'Redirect detected {uri} vs {oReqResp.url} (final)')
raise TrustorError(f'Redirect detected {uri} vs {oReqResp.url} (final)')
return oReqResp
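The proxies mapping built near the top of this function is worth pinning down, since the scheme matters: `socks5h` (note the `h`) makes hostname resolution happen on the proxy side, so DNS lookups for proof URLs go through Tor rather than the local resolver. A minimal standalone sketch; `build_proxy` is an illustrative helper, not part of the original code.

```python
# Illustrative helper: build the requests-style proxies mapping for routing
# HTTPS fetches through Tor's SOCKS port. 'socks5h' (vs plain 'socks5')
# resolves hostnames through the proxy, so DNS is not leaked locally.
def build_proxy(host='127.0.0.1', port=9050):
    return {'https': f'socks5h://{host}:{port}'}

print(build_proxy())
```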
# There's no point in using asyncio because of duplicate urls in the tasks
async def oDownloadUrlHttpx(uri, sCAfile, timeout=30, host='127.0.0.1', port=9050, content_type='text/plain'):
import httpcore
import asyncio
import httpx
# socks proxy used for outbound web requests (for validation of proofs)
if host and port:
proxy = f"socks5://{host}:{port}"
else:
proxy = ''
# we use this UA string when connecting to webservers to fetch rsa-fingerprint.txt proof files
# https://nusenu.github.io/ContactInfo-Information-Sharing-Specification/#uri-rsa
headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; rv:91.0) Gecko/20100101 Firefox/91.0'}
LOG.debug("fetching %s...." % uri)
async with httpx.AsyncClient(proxies=proxy) as client:
try:
# https://www.python-httpx.org/advanced/
head = await client.head(uri, timeout=timeout, headers=headers)
except Exception as e:
LOG.exception(f"{e}")
raise TrustorError(f"HTTP HEAD request failed for {uri} {e}")
if head.status_code >= 300:
raise TrustorError(f"HTTP Errorcode {head.status_code}")
if content_type and not head.headers['Content-Type'].startswith(content_type):
raise TrustorError(f"HTTP Content-Type != {content_type}")
if not os.path.exists(sCAfile):
raise TrustorError(f"File not found CAfile {sCAfile}")
try:
oReqResp = await client.get(url=uri,
timeout=timeout,
headers=headers,
max_redirects=0,
verify=sCAfile,
)
except (asyncio.exceptions.CancelledError,
httpcore.PoolTimeout,
Exception,) as e:
LOG.warning(f"HTTP GET request failed for {uri}: {e}")
raise
if oReqResp.status_code != 200:
LOG.warning(f"HTTP Errorcode {oReqResp.status_code}")
raise TrustorError(f"HTTP Errorcode {oReqResp.status_code}")
if not oReqResp.headers['Content-Type'].startswith('text/plain'):
LOG.warning("HTTP Content-Type != text/plain")
raise TrustorError("HTTP Content-Type != text/plain")
# check for redirects (not allowed as per spec)
if oReqResp.url != uri:
LOG.error(f'Redirect detected {uri} vs {oReqResp.url} (final)')
raise TrustorError(f'Redirect detected {uri} vs {oReqResp.url} (final)')
return oReqResp
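The comment above the async variant notes that duplicate URIs blunt any benefit from asyncio: many relays share one operator domain, so the same proof URL comes up repeatedly. A sketch of deduplicating candidate URIs before fetching; `unique_uris` is illustrative, not a name from the original code.

```python
# Illustrative: collapse repeated operator domains to one proof URI each,
# preserving first-seen order, so each rsa-fingerprint.txt is fetched once.
def unique_uris(domains):
    seen = set()
    ordered = []
    for d in domains:
        uri = f"https://{d}/.well-known/tor-relay/rsa-fingerprint.txt"
        if uri not in seen:
            seen.add(uri)
            ordered.append(uri)
    return ordered

uris = unique_uris(['emeraldonion.org', 'emeraldonion.org', 'example.net'])
print(len(uris))  # 2 distinct URIs
```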
logging.getLogger("urllib3").setLevel(logging.INFO)
#import urllib3.contrib.pyopenssl
#urllib3.contrib.pyopenssl.inject_into_urllib3()
import urllib3.util
import ipaddress
def ballow_subdomain_matching(hostname, dnsnames):
for elt in dnsnames:
if len(hostname.split('.')) > len(elt.split('.')) and \
hostname.endswith(elt):
if len(hostname.split('.')) > len(elt.split('.')) and hostname.endswith('.' + elt):
# parent
return True
return False
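A standalone sketch of the subdomain test above. Anchoring the suffix comparison at a label boundary (`'.' + parent`) matters: a bare `endswith()` would also accept look-alike names such as `a.evilexample.com` against `example.com`. `is_subdomain` is an illustrative re-implementation, not a name from the original code.

```python
# Illustrative: a hostname counts as a subdomain only if it ends with
# '.' + parent, i.e. the match is anchored at a DNS label boundary.
def is_subdomain(hostname, parents):
    for parent in parents:
        if hostname != parent and hostname.endswith('.' + parent):
            return True
    return False

print(is_subdomain('exit1.emeraldonion.org', ['emeraldonion.org']))  # True
print(is_subdomain('evilemeraldonion.org', ['emeraldonion.org']))    # False
```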
from urllib3.util.ssl_match_hostname import (CertificateError,
match_hostname,
_dnsname_match,
_ipaddress_match,
)
from urllib3.util.ssl_match_hostname import (CertificateError, _dnsname_match,
_ipaddress_match)
def my_match_hostname(cert, hostname):
"""Verify that *cert* (in decoded format as returned by
SSLSocket.getpeercert()) matches the *hostname*. RFC 2818 and RFC 6125
@@ -341,10 +398,10 @@ def my_match_hostname(cert, hostname):
raise CertificateError(
"no appropriate commonName or subjectAltName fields were found"
)
match_hostname = my_match_hostname
from urllib3.util.ssl_ import (
is_ipaddress,
)
urllib3.util.ssl_match_hostname.match_hostname = my_match_hostname
from urllib3.util.ssl_ import is_ipaddress
def _my_match_hostname(cert, asserted_hostname):
# Our upstream implementation of ssl.match_hostname()
# only applies this normalization to IP addresses so it doesn't
@@ -364,12 +421,18 @@ def _my_match_hostname(cert, asserted_hostname):
# the cert when catching the exception, if they want to
e._peer_cert = cert
raise
from urllib3.connection import _match_hostname, HTTPSConnection
urllib3.connection._match_hostname = _my_match_hostname
from urllib3.contrib.socks import SOCKSProxyManager
from urllib3 import Retry
def oDownloadUrlUrllib3(uri, sCAfile, timeout=30, host='127.0.0.1', port=9050):
# from urllib3 import Retry
def oDownloadUrlUrllib3Socks(uri,
sCAfile,
timeout=30,
host='127.0.0.1',
port=9050,
session=None,
content_type='text/plain'):
"""There's no need to use requests here, and it
adds too many layers on top of SSL to be able to get at things
"""
@@ -384,7 +447,7 @@ def oDownloadUrlUrllib3(uri, sCAfile, timeout=30, host='127.0.0.1', port=9050):
# we use this UA string when connecting to webservers to fetch rsa-fingerprint.txt proof files
# https://nusenu.github.io/ContactInfo-Information-Sharing-Specification/#uri-rsa
headers = {'User-Agent':'Mozilla/5.0 (Windows NT 10.0; rv:91.0) Gecko/20100101 Firefox/91.0'}
headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; rv:91.0) Gecko/20100101 Firefox/91.0'}
LOG.debug("fetching %s...." % uri)
try:
@@ -401,8 +464,8 @@ def oDownloadUrlUrllib3(uri, sCAfile, timeout=30, host='127.0.0.1', port=9050):
if head.status >= 300:
raise TrustorError(f"HTTP Errorcode {head.status}")
if not head.headers['Content-Type'].startswith('text/plain'):
raise TrustorError(f"HTTP Content-Type != text/plain")
if content_type and not head.headers['Content-Type'].startswith(content_type):
raise TrustorError(f"HTTP Content-Type != {content_type}")
if not os.path.exists(sCAfile):
raise TrustorError(f"File not found CAfile {sCAfile}")
@@ -416,21 +479,24 @@ def oDownloadUrlUrllib3(uri, sCAfile, timeout=30, host='127.0.0.1', port=9050):
raise
if oReqResp.status != 200:
raise TrustorError(f"HTTP Errorcode {oReqResp.status}")
if not oReqResp.headers['Content-Type'].startswith('text/plain'):
raise TrustorError(f"HTTP Content-Type != text/plain")
if content_type and not oReqResp.headers['Content-Type'].startswith(content_type):
raise TrustorError(f"HTTP Content-Type != {content_type}")
#check for redirects (not allowed as per spec)
# check for redirects (not allowed as per spec)
if oReqResp.geturl() != uri:
LOG.error(f'Redirect detected %s vs %s (final)' % (uri, oReqResp.geturl()))
raise TrustorError(f'Redirect detected %s vs %s (final)' % (uri, oReqResp.geturl()))
oReqResp.decode_content = True
return oReqResp
import urllib3.connectionpool
from urllib3.connection import HTTPSConnection
urllib3.connectionpool.VerifiedHTTPSConnection = HTTPSConnection
def lDownloadUrlFps(domain, sCAfile, timeout=30, host='127.0.0.1', port=9050):
uri="https://"+domain+"/.well-known/tor-relay/rsa-fingerprint.txt"
uri = f"https://{domain}/.well-known/tor-relay/rsa-fingerprint.txt"
o = oDownloadUrlRequests(uri, sCAfile, timeout=timeout, host=host, port=port)
well_known_content = o.text.upper().strip().split('\n')
well_known_content = [i for i in well_known_content if i and len(i) == 40]
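The two-line parse above can be exercised on its own: per the uri-rsa proof, the well-known file holds one 40-hex-character RSA fingerprint per line. `parse_well_known` is an illustrative wrapper around the same logic, not a name from the original code.

```python
# Illustrative wrapper mirroring the parse above: uppercase the body, split
# on newlines, and keep only non-empty 40-character lines (fingerprints).
def parse_well_known(text):
    lines = text.upper().strip().split('\n')
    return [i for i in lines if i and len(i) == 40]

body = ("044600fd968728a6f220d5347ad897f421b757c0\n"
        "\n"
        "not-a-fingerprint\n")
print(parse_well_known(body))
```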
@@ -450,7 +516,7 @@ def validate_proofs(candidates, validation_cache_file, timeout=20, host='127.0.0
for domain in candidates.keys():
for prooftype in candidates[domain].keys():
if prooftype == 'uri-rsa':
well_known_content = lDownloadUrlFps(domain, timeout=timeout, host=host, port=port)
well_known_content = lDownloadUrlFps(domain, sCAfile, timeout=timeout, host=host, port=port)
for fingerprint in candidates[domain][prooftype]:
if fingerprint in well_known_content:
# write cache entry
@@ -460,7 +526,7 @@ def validate_proofs(candidates, validation_cache_file, timeout=20, host='127.0.0
LOG.error('%s:%s:%s' % (fingerprint, domain, prooftype))
elif prooftype == 'dns-rsa' and ub_ctx:
for fingerprint in candidates[domain][prooftype]:
fp_domain = fingerprint+'.'+domain
fp_domain = fingerprint + '.' + domain
if idns_validate(fp_domain,
libunbound_resolv_file='resolv.conf',
dnssec_DS_file='dnssec-root-trust',
@@ -488,7 +554,6 @@ def idns_validate(domain,
# this is not the system wide /etc/resolv.conf
# use dnscrypt-proxy to encrypt your DNS and route it via tor's SOCKSPort
ctx = ub_ctx()
if os.path.isfile(libunbound_resolv_file):
ctx.resolvconf(libunbound_resolv_file)
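For the dns-rsa proof type handled above, the name being validated is the relay fingerprint prepended as a label to the operator domain, which is then looked up over DNS (with DNSSEC) to prove the operator controls the zone. A trivial sketch of the name construction; `dns_rsa_name` is an illustrative helper, not a name from the original code.

```python
# Illustrative: dns-rsa proofs are DNS lookups on <fingerprint>.<domain>,
# matching the fp_domain construction in validate_proofs above.
def dns_rsa_name(fingerprint, domain):
    return fingerprint + '.' + domain

name = dns_rsa_name('044600FD968728A6F220D5347AD897F421B757C0',
                    'emeraldonion.org')
print(name)
```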
@@ -526,12 +591,10 @@ def configure_tor(controller, trusted_fingerprints, exitonly=True):
try:
controller.set_conf('ExitNodes', trusted_fingerprints)
LOG.error('limited exits to %s relays' % relay_count)
except Exception as e:
except Exception as e: # noqa
LOG.exception('Failed to set ExitNodes tor config to trusted relays')
sys.exit(20)
if __name__ == '__main__':
CAfile = '/etc/ssl/certs/ca-certificates.crt'
trust_config = 'trust_config'
@@ -542,12 +605,12 @@ if __name__ == '__main__':
trusted_fingerprints = read_local_validation_cache(validation_cache_file,
trusted_domains=trusted_domains)
# tor ControlPort password
controller_password=''
controller_password = ''
# tor ControlPort IP
controller_address = '127.0.0.1'
timeout = 20
port = 9050
controller = get_controller(address=controller_address,password=controller_password)
controller = get_controller(address=controller_address, password=controller_password)
r = find_validation_candidates(controller,
validation_cache=trusted_fingerprints,