commit 514413db94

    dump

.gitignore
@@ -0,0 +1,3 @@
*.pyc
*.sqlite
data

Makefile
@@ -0,0 +1,32 @@
install:
	apt install postgresql postgresql-contrib nginx
	pip install chopy uwsgi sqlalchemy psycopg2
	su postgres -c "psql postgres -c \"ALTER USER postgres WITH ENCRYPTED PASSWORD 'slightly-secure-passphrase'\""
	-su postgres -c "psql postgres -c \"CREATE DATABASE chopy\""
	sed "s#xyzzy#$(shell pwd)#g" conf/nginx.conf > /etc/nginx/sites-available/niku
	ln -fs /etc/nginx/sites-available/niku /etc/nginx/sites-enabled/niku
	rm -f /etc/nginx/sites-enabled/default
	service nginx reload

uninstall:
	rm -f /etc/nginx/sites-*/niku
	ln -fs /etc/nginx/sites-available/default /etc/nginx/sites-enabled/default
	service nginx reload
	su postgres -c "psql postgres -c \"DROP DATABASE chopy\""

launch:
	python launch-chopy.py
	/usr/local/bin/uwsgi --ini conf/uwsgi.conf

debug:
	python launch-chopy.py
	/usr/local/bin/uwsgi --ini conf/uwsgi-debug.conf

stop:
	-pgrep -f 'python -m chopy' | xargs kill
	-killall -INT uwsgi

clean: stop
	rm -rf data
	-su postgres -c "psql postgres -c \"DROP DATABASE chopy\""
	-su postgres -c "psql postgres -c \"CREATE DATABASE chopy\""

README.md
@@ -0,0 +1,62 @@
# niku-server

Don't forget to install chopy!
As long as it imports in the same shell you run `make launch` from, you're good (quick check: `python -c "import chopy"` should exit silently).

## Makefile commands

- `sudo make install`: Install prerequisites, configure nginx/postgres
- `sudo make uninstall`: Remove nginx/postgres configuration
- `sudo make clean`: Reset the environment and database to their post-install state
- `make launch`: Launch an instance with production parameters
- `make debug`: Launch an instance with debug parameters, listening on localhost:8080
- `make stop`: Halt the components started by `launch` or `debug`

## System architecture

### nginx

Very simple nginx configuration, in `conf/nginx.conf`.
It basically just serves the static pcaps and forwards the API requests to uwsgi.

### uwsgi

`make launch` will start an instance of uwsgi, serving `app.py`.
It will use the configuration parameters from `config.py`, and log to `data/log/uwsgi.log`.

This is basically just an API wrapper around the chopy database.
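
A quick way to check that the WSGI app is up is to hit the debug instance (which listens on localhost:8080 when started with `make debug`). A minimal sketch in Python 2, matching the server code:

```python
# Smoke test for a running `make debug` instance.
import urllib2

# app.py answers the root path with a plain-text greeting.
print urllib2.urlopen('http://localhost:8080/').read()
# -> BEEP BOOP WELCOME TO THE API
```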

### chopy

`make launch` will run `launch-chopy.py` to load the configuration from `config.py` and launch a chopy instance, by default logging to `data/log/chopy.log`.
All the folders expected by chopy will be put in a `data` folder.
By default you should dump pcaps into `data/pcap_dump`, and they will be sorted into the database and `data/pcap_split`.
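
For example, feeding a capture into the pipeline is just a copy into the monitored directory (the source path here is made up):

```python
# Hand a capture to chopy by dropping it into the monitored directory.
import shutil

shutil.copy('/tmp/capture.pcap', 'data/pcap_dump/')  # hypothetical capture file
```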

## API

Most of the interfaces take their parameters as a json-encoded object passed in the query string, for example `GET /api/search?{}`.
I'm very sorry for this.
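
Concretely, the convention is: `json.dumps` the parameters, URL-quote the result, and put it after the `?`. A minimal sketch in Python 2 against the debug instance; `search_regex` shows up in `app.py`, and any other keys are passed straight through to `chopy.search.search`:

```python
# Search via /api/search with a json-encoded dict in the query string.
import json
import urllib
import urllib2

params = {'search_regex': 'flag{'}  # example search term (hypothetical)
url = 'http://localhost:8080/api/search?' + urllib.quote(json.dumps(params))
ids = [int(line) for line in urllib2.urlopen(url).read().splitlines() if line]
print ids
```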

- `GET /api/search` - search the database and index.
  Provide a dictionary of parameters that are the keyword arguments to `chopy.search.search`.
  Returns the matching IDs, one per line.
- `GET /api/metadata` - retrieve stream metadata (see the sketch after this list).
  Provide a list of ids for which to retrieve the metadata.
  Returns the metadata as a series of json-encoded dictionaries, one per line. No guarantee is made about the order of the returned values; check the `id` of each.
  The metadata is in the same form as the chopy database, but as dictionaries instead of relations.
- `GET /pcap/<path>` - replace `<path>` with the `filename` attribute from a stream's metadata to download its individual pcap.
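
Continuing the sketch above, a client can pull metadata for a set of ids and then download one stream's pcap. Again Python 2; the ids are made-up stand-ins for values returned by `/api/search`:

```python
# Look up metadata for a list of ids, then fetch one stream's pcap.
import json
import urllib
import urllib2

base = 'http://localhost:8080'
ids = [1, 2, 3]  # hypothetical ids, e.g. from /api/search

url = base + '/api/metadata?' + urllib.quote(json.dumps(ids))
for line in urllib2.urlopen(url).read().splitlines():
    if not line:
        continue
    meta = json.loads(line)  # one json dict per line, order not guaranteed
    pcap = urllib2.urlopen(base + '/pcap/' + meta['filename']).read()
    print '%s: %d bytes of pcap' % (meta['id'], len(pcap))
```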

### Tags, Services, and Hosts

Tags, services, and hosts use a similar API to get/set/delete data (there is a sketch after the lists below).

- `GET /api/<kind>/get` - Retrieve all the known resources of the given kind.
  Returns each resource as a separate json-encoded dictionary, one per line.
- `GET /api/<kind>/set` - Create or update the given resource.
  Provide as a dictionary all the identifier and data arguments for the resource.
- `GET /api/<kind>/del` - Delete a given resource.
  Provide as a dictionary all the identifier arguments for the resource.

- For kind `tags`, use identifier arguments `connection` and `text`. There are no data arguments.
- For kind `services`, use identifier arguments `protocol`, `host`, and `port`, and `name` as a data argument.
- For kind `hosts`, use `boot_time` as an identifier argument and `name` as a data argument.
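
For instance, tagging a connection, listing the known tag texts, and deleting the tag again might look like this (Python 2; the connection id and tag text are made up):

```python
# Exercise the /api/tags/* endpoints.
import json
import urllib
import urllib2

base = 'http://localhost:8080'

def call(path, payload):
    return urllib2.urlopen(base + path + '?' + urllib.quote(json.dumps(payload))).read()

tag = {'connection': 42, 'text': 'suspicious'}  # hypothetical identifier arguments
call('/api/tags/set', tag)       # create the tag
print call('/api/tags/get', {})  # one json dict per line, e.g. {"text": "suspicious"}
call('/api/tags/del', tag)       # delete it again
```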

app.py
@@ -0,0 +1,188 @@
import chopy.search
import chopy.db
import traceback
import config
import json
import urllib
import os


def application(environ, start_response):
    path = environ['PATH_INFO']
    qs = environ['QUERY_STRING']

    try:
        if path == '/':
            start_response('200 OK', [('Content-type', 'text/plain')])
            yield 'BEEP BOOP WELCOME TO THE API'

        elif path == '/api/search':
            start_response('200 OK', [('Content-type', 'text/plain')])
            for cid in do_search(environ['QUERY_STRING']):
                yield '%d\r\n' % cid

        elif path == '/api/sort':
            start_response('200 OK', [('Content-type', 'text/plain')])
            for cid in do_sort(environ['QUERY_STRING']):
                yield '%d\r\n' % cid

        elif path == '/api/metadata':
            start_response('200 OK', [('Content-type', 'text/plain')])
            for conn in do_lookup(environ['QUERY_STRING']):
                yield '%s\r\n' % json.dumps(conn)

        elif path in extra_paths:
            start_response('200 OK', [('Content-type', 'text/plain')])
            for x in extra_paths[path](qs):
                yield x

        elif path.startswith('/pcap'):
            # Map /pcap/<filename> onto config.split_dir/<filename>.
            start_response('200 OK', [('Content-type', 'application/octet-stream')])
            pathkeys = path.split('/')
            pathkeys[0:2] = [config.split_dir]
            filepath = os.path.join(*pathkeys)
            yield open(filepath).read()

        else:
            start_response('404 NOT FOUND', [('Content-type', 'text/plain')])
            yield 'The URL you requested could not be serviced by the application.\r\n'

    except:  # pylint: disable=bare-except
        tb = traceback.format_exc()
        start_response('500 INTERNAL SERVER ERROR', [('Content-type', 'text/plain')])
        yield 'The server encountered an error during execution. Here is some debug information.\r\n\r\n'
        yield 'Environment:\r\n'
        for key in environ:
            yield ' %s=%s\r\n' % (key, environ[key])
        yield '\r\n'
        yield tb.replace('\n', '\r\n')
        print tb


# Helpers for the 9-bit "nyte" encoding (see the search_regex handling in
# do_search below); neither function is referenced elsewhere in this file.

def nytes_to_bit_string(n):
    # Expand each character of n to its 8-bit binary string.
    bin_str = "".join(bin(ord(c))[2:].zfill(8) for c in n)
    #num_bits = (len(n) * 8) % 9
    #return bin_str[:len(bin_str) - num_bits]
    return bin_str


def bytes_to_nytes(b):
    # Widen each character of b to 9 bits, left-pad to a whole number of
    # 8-bit bytes, and repack the bit string into ordinary characters.
    bin_str = "".join(bin(ord(c))[2:].zfill(9) for c in b)
    bin_str = bin_str.zfill(len(bin_str) + ((8 - (len(bin_str) % 8)) % 8))
    return "".join(chr(int(bin_str[i:i+8], 2)) for i in xrange(0, len(bin_str), 8))


def do_search(query_string):
    _, session = chopy.db.connect(config.database)
    data = json.loads(urllib.unquote(query_string))
    if type(data) is not dict:
        raise ValueError("Query string for /api/search must be a json dictionary")

    if 'search_regex' in data and data['search_regex']:
        # OH BOY
        # The captured payloads are apparently stored as 9-bit characters
        # packed into 8-bit bytes, so a plain search string can land on any of
        # 9 bit offsets. Widen the search string to 9 bits per character, then
        # re-read it at every offset as 8-bit bytes and OR the alternatives
        # together as hex escapes.
        bstr = str(data['search_regex'])
        bitstr = ''.join(bin(ord(c))[2:].zfill(9) for c in bstr)
        possible_values = []
        for i in xrange(9):
            trunc_bitstr = bitstr[i:]
            possible_values.append(''.join(chr(int(trunc_bitstr[j:j+8], 2)) for j in xrange(0, len(trunc_bitstr)-7, 8)))

        data['search_regex'] = '|'.join(''.join('\\x%02x' % ord(c) for c in x) for x in possible_values)

    data['index_dir'] = config.index_dir
    for cid in chopy.search.search(session, **data):
        yield cid


def do_sort(query_string):
    _, session = chopy.db.connect(config.database)
    data = json.loads(urllib.unquote(query_string))
    if type(data) is not dict:
        raise ValueError("Query string for /api/sort must be a json dictionary")
    cids = data['ids']
    key = data['order_by']

    q = session.query(chopy.db.Connection.id) \
        .filter(chopy.db.Connection.id.in_(cids)) \
        .order_by(getattr(chopy.db.Connection, key))
    for conn in q:
        yield str(conn.id)


def do_lookup(query_string):
    _, session = chopy.db.connect(config.database)
    data = json.loads(urllib.unquote(query_string))
    if type(data) is not list:
        raise ValueError("Query string for /api/metadata must be a json list")
    q = session.query(chopy.db.Connection) \
        .filter(chopy.db.Connection.id.in_(data)) \
        .options(chopy.db.joinedload('tags'))
    for conn in q:
        out = conn.dict()
        out['tags'] = [x.text for x in out['tags']]
        #out['tags'] = conn.tags
        yield out


# this is... the least readable code I've ever written

def make_setter(cls, args, updatable_args):
    # Returns a handler that creates cls(**data), or updates the columns named
    # in updatable_args when a row matching the remaining (identifier) columns
    # already exists.
    def set_something(query_string):
        _, session = chopy.db.connect(config.database)
        data = json.loads(urllib.unquote(query_string))
        if type(data) is not dict:
            raise ValueError("Query string for set API must be a json dictionary")
        if set(args) != set(data):
            raise ValueError("Expected arguments: %r" % args)

        obj = None
        if updatable_args:
            q = session.query(cls)
            for arg in args:
                if arg not in updatable_args:
                    q = q.filter(getattr(cls, arg) == data[arg])
            obj = q.first()
        if obj is not None:
            for arg in updatable_args:
                setattr(obj, arg, data[arg])
        else:
            obj = cls(**data)
            session.add(obj)

        session.commit()
        yield str(data)
    return set_something


def make_getter(cls, result_args):
    def get_something(query_string):  # pylint: disable=unused-argument
        _, session = chopy.db.connect(config.database)
        q = session.query(*[getattr(cls, arg) for arg in result_args]).distinct()
        for r in q:
            yield '%s\r\n' % json.dumps({arg: getattr(r, arg) for arg in result_args})
    return get_something


def make_deleter(cls, args):
    def del_something(query_string):
        _, session = chopy.db.connect(config.database)
        data = json.loads(urllib.unquote(query_string))
        if type(data) is not dict:
            raise ValueError("Query string for delete API must be a json dictionary")
        if set(args) != set(data):
            raise ValueError("Expected arguments: %r" % args)
        q = session.query(cls)
        for arg in args:
            q = q.filter(getattr(cls, arg) == data[arg])
        q.delete()
        session.commit()
        yield '%s\r\n' % data
    return del_something


tags_setter = make_setter(chopy.db.Tag, ['connection', 'text'], [])
tags_getter = make_getter(chopy.db.Tag, ['text'])
tags_deleter = make_deleter(chopy.db.Tag, ['connection', 'text'])

services_setter = make_setter(chopy.db.ServiceName, ['protocol', 'host', 'port', 'name'], ['name'])
services_getter = make_getter(chopy.db.ServiceName, ['protocol', 'host', 'port', 'name'])
services_deleter = make_deleter(chopy.db.ServiceName, ['protocol', 'host', 'port', 'name'])

hosts_setter = make_setter(chopy.db.HostName, ['boot_time', 'name'], ['name'])
hosts_getter = make_getter(chopy.db.HostName, ['boot_time', 'name'])
hosts_deleter = make_deleter(chopy.db.HostName, ['boot_time', 'name'])

extra_paths = {
    '/api/tags/get': tags_getter, '/api/tags/set': tags_setter, '/api/tags/del': tags_deleter,
    '/api/services/get': services_getter, '/api/services/set': services_setter, '/api/services/del': services_deleter,
    '/api/hosts/get': hosts_getter, '/api/hosts/set': hosts_setter, '/api/hosts/del': hosts_deleter
}

conf/nginx.conf
@@ -0,0 +1,17 @@
# "xyzzy" is a placeholder replaced with the repository path by `make install`.
server {
    listen *:80;
    root xyzzy;

    location /pcap {
        alias xyzzy/data/pcap_split;
    }

    location /api {
        include uwsgi_params;
        uwsgi_pass unix:///tmp/uwsgi-niku.sock;
    }

    location =/index.html {
        alias xyzzy/index.html;
    }
}

conf/uwsgi-debug.conf
@@ -0,0 +1,5 @@
[uwsgi]
http = localhost:8080
daemonize = data/log/uwsgi.log
wsgi-file = app.py
fs-reload = app.py

conf/uwsgi.conf
@@ -0,0 +1,5 @@
[uwsgi]
socket = /tmp/uwsgi-niku.sock
daemonize = data/log/uwsgi.log
wsgi-file = app.py
processes = 4

config.py
@@ -0,0 +1,12 @@
base_dir = 'data'
index_dir = 'data/index'
split_dir = 'data/pcap_split'
monitor_dir = 'data/pcap_dump'
filter_dir = 'data/filters'
log_dir = 'data/log'
flag_dir = 'data/flags'
all_dirs = [base_dir, index_dir, split_dir, filter_dir, monitor_dir, log_dir, flag_dir]

database = 'postgres://postgres:slightly-secure-passphrase@localhost:5432/chopy'
#database = 'sqlite:///data/sqlite.db'
logfile = 'data/log/chopy.log'

index.html
@@ -0,0 +1,35 @@
<html>
<head>
<title>Niku</title>
</head>
<body>
<h1>
Welcome to niku - almost
</h1>

<p>
This year, our IDS is a <em>client-based web server</em>.
To use it, follow these steps:
</p>

<ol>
<li>Install the prerequisites: <pre>pip install flask flask_wtf</pre></li>
<li>Download the client: <pre>git clone git@git.seclab.cs.ucsb.edu:sherlock/niku-client.git && cd niku-client</pre></li>
<li>Run the local server: <pre>python run.py</pre></li>
<li>Navigate to <a href="http://localhost:5000">http://localhost:5000</a> and start your analysis!</li>
</ol>

<p>
If at any point an update is released, you can upgrade to the latest version of the client by simply pulling the git repository.
</p>

<p>
If at any point you run into weird issues, your first debugging step should be to remove <pre>~/.niku</pre>.
This will clear your search history and local cache.
</p>

<p>
<strong>Please be gentle with the server!</strong>
</p>
</body>
</html>

launch-chopy.py
@@ -0,0 +1,22 @@
import os
from config import *  # pylint: disable=wildcard-import,unused-wildcard-import


def launch():
    for dirname in all_dirs:
        if not os.path.isdir(dirname):
            os.mkdir(dirname)

    if os.fork() == 0:
        # Child: redirect stdout/stderr to the chopy log, then exec chopy
        # with the paths and database from config.py.
        f = open(logfile, 'a')
        os.close(0)
        os.dup2(f.fileno(), 1)
        os.dup2(f.fileno(), 2)
        os.execlp('python', 'python', '-m', 'chopy',
                  '--output_dir', split_dir,
                  '--monitor', monitor_dir,
                  '--database', database,
                  '--index_dir', index_dir,
                  '--filter_dir', filter_dir)


if __name__ == '__main__':
    launch()