Configure Elasticsearch (/etc/elasticsearch/elasticsearch.yml)
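A minimal sketch of the settings typically touched here; the cluster and node names are illustrative, not from these notes:

```yaml
# /etc/elasticsearch/elasticsearch.yml -- minimal sketch, values illustrative
cluster.name: my-cluster     # hypothetical name
node.name: node-1            # hypothetical name
network.host: localhost      # bind only to the local interface
http.port: 9200              # matches the ufw rule below
```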
Enable port 9200
$sudo ufw allow 9200
$sudo ufw status
Configure Logstash (/etc/logstash/logstash.yml)
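A sketch of the Logstash settings file (distinct from the pipeline config); these paths match the Debian/Ubuntu package defaults, and the node name is illustrative:

```yaml
# /etc/logstash/logstash.yml -- minimal sketch, values illustrative
node.name: logstash-1        # hypothetical name
path.data: /var/lib/logstash
path.logs: /var/log/logstash
```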
Install JRuby if it is not already installed
$sudo apt install jruby
Configure Filebeat (/etc/filebeat/filebeat.yml)
Configure Kibana (requires Oracle Java 8)
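A sketch of the Kibana settings usually involved; the values are illustrative, and `elasticsearch.url` is the Kibana 6.x-era setting name (later versions renamed it `elasticsearch.hosts`):

```yaml
# /etc/kibana/kibana.yml -- minimal sketch, values illustrative
server.port: 5601
server.host: "localhost"
elasticsearch.url: "http://localhost:9200"
```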
Collect log files under /var/log/*.log and ship them to Logstash
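The step above can be sketched in filebeat.yml; note that the default `output.elasticsearch` section must be commented out so Filebeat ships to Logstash instead (the section name `filebeat.prospectors` is the 6.x-era key, renamed `filebeat.inputs` in later versions):

```yaml
# /etc/filebeat/filebeat.yml -- minimal sketch
filebeat.prospectors:          # "filebeat.inputs" in newer Filebeat versions
- type: log
  enabled: true
  paths:
    - /var/log/*.log           # files to collect
# output.elasticsearch:        # keep the default ES output disabled
#   hosts: ["localhost:9200"]
output.logstash:
  hosts: ["localhost:5044"]    # matches the Beats input port below
```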
Create a new Logstash processing pipeline (/etc/logstash/config/first-pipeline.conf)
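A minimal pipeline sketch in the style of the referenced Elastic tutorial: a Beats input on port 5044 and an Elasticsearch output; the grok filter is optional and assumes Apache-style log lines, so adjust or drop it for other formats:

```conf
# /etc/logstash/config/first-pipeline.conf -- minimal sketch
input {
  beats {
    port => 5044               # receives events from Filebeat
  }
}
filter {
  grok {
    # optional: parse Apache-style access logs; adjust for your log format
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```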
Enable port 5044
$sudo ufw allow 5044
Run the Logstash server (from /usr/share/logstash)
$sudo bin/logstash -f /etc/logstash/config/first-pipeline.conf --config.reload.automatic
Run Filebeat (from /usr/share/filebeat)
$sudo bin/filebeat -e -c /etc/filebeat/filebeat.yml -d "publish"
Show Elasticsearch indices (quote the URL so the shell does not interpret the ?)
$curl 'http://localhost:9200/_cat/indices?v'
Retrieve some results from the logs (quote the URL so the shell does not interpret ? and &)
$curl 'http://localhost:9200/logstash-2018.02.23/_search?q=sudo&pretty=true&size=50'
Delete an index (here named twitter) in Elasticsearch
$curl -XDELETE 'localhost:9200/twitter?pretty'
Remove Filebeat's data registry to force re-indexing
$rm -f /usr/share/filebeat/bin/data/registry
Find running Logstash instances : $ps aux | grep "logstash"
Kill the processes (using the PIDs from the ps output above) : $sudo kill -9 18543 18544
References:
https://www.elastic.co/guide/en/logstash/current/advanced-pipeline.html