REVISED on February 23, 2015 due to several minor changes in the new packages.
Overview of Setup:
Logstash Server:
Ubuntu 14.04 LTS with 4 GB of RAM
Part 1 > Install OpenJDK
- Install OpenJDK
$ sudo apt-get update
$ sudo apt-get install openjdk-7-jre-headless
Part 2 > Install Logstash (The Indexer)
This is on the log server side. It indexes the logs and pipes them into Elasticsearch.
- Download Logstash & install the files
wget https://download.elasticsearch.org/logstash/logstash/packages/debian/logstash_1.4.2-1-2c0f5a1_all.deb
dpkg -i logstash_1.4.2-1-2c0f5a1_all.deb
- Generate SSL Certs
mkdir -p /etc/pki/tls/certs
mkdir /etc/pki/tls/private
- Add a subjectAltName for the host's IP in the [v3_ca] section:
nano /etc/ssl/openssl.cnf
subjectAltName = IP:ipaddressofhost
- Next, generate the certificate and private key:
cd /etc/pki/tls
sudo openssl req -x509 -nodes -days 3650 -newkey rsa:2048 -keyout private/logstash-forwarder.key -out certs/logstash-forwarder.crt
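To confirm the certificate was generated correctly, you can parse it back with openssl. A sketch, run against a throwaway cert in a scratch directory rather than the real /etc/pki/tls tree (the -subj value here is illustrative; the real command above will prompt for those fields interactively):

```shell
# Sketch: generate a throwaway self-signed cert the same way, then
# print its subject and validity dates to verify it parses cleanly.
tmp=$(mktemp -d)
openssl req -x509 -nodes -days 3650 -newkey rsa:2048 \
  -subj "/CN=logstash" \
  -keyout "$tmp/logstash-forwarder.key" \
  -out "$tmp/logstash-forwarder.crt" 2>/dev/null
openssl x509 -in "$tmp/logstash-forwarder.crt" -noout -subject -dates
rm -rf "$tmp"
```

Run the same `openssl x509 -noout -subject -dates` check against certs/logstash-forwarder.crt to make sure the real one is readable before shipping it to the forwarders.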
- Later, we’ll copy that certificate to each server that will be forwarding logs to Logstash (the private key stays on the Logstash server).
- Next, we’ll configure Logstash. Config files should be placed in /etc/logstash/conf.d/
- First, create an input config. We’ll name it 01-lumberjack-input.conf; the 01 prefix places it first in line to be read by Logstash.
nano /etc/logstash/conf.d/01-lumberjack-input.conf
- Place this in the lumberjack input conf:
input {
  lumberjack {
    port => 5000
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}
- Next, let’s create a filter for syslog messages:
nano /etc/logstash/conf.d/10-syslog.conf
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
- Grok will parse the messages based on the above specifications which will make the logs structured and searchable inside Kibana.
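As a rough illustration of what that grok pattern pulls out of a line, here is a plain-shell approximation (this is not Logstash itself, and the sample log line is made up; it just mimics the capture groups):

```shell
# A sample syslog line of the kind the filter matches:
line='Feb 23 08:15:01 webhost sshd[1234]: Accepted publickey for deploy'

# Rough shell approximation of the grok captures:
syslog_timestamp=$(echo "$line" | awk '{print $1, $2, $3}')
syslog_hostname=$(echo "$line"  | awk '{print $4}')
syslog_program=$(echo "$line"   | awk '{print $5}' | sed 's/\[.*//; s/:$//')
syslog_pid=$(echo "$line"       | awk '{print $5}' | sed -n 's/.*\[\([0-9]*\)\].*/\1/p')
syslog_message=$(echo "$line"   | cut -d' ' -f6-)

echo "timestamp=$syslog_timestamp hostname=$syslog_hostname program=$syslog_program pid=$syslog_pid"
# → timestamp=Feb 23 08:15:01 hostname=webhost program=sshd pid=1234
echo "message=$syslog_message"
# → message=Accepted publickey for deploy
```

In Kibana, each of those becomes its own searchable field (syslog_hostname, syslog_program, and so on) rather than one opaque message string.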
- For the last component, we’ll create the lumberjack output config file:
nano /etc/logstash/conf.d/30-lumberjack-output.conf
output {
  elasticsearch { host => localhost }
  stdout { codec => rubydebug }
}
- Additional filters need to be created for each type of log (e.g. Apache). You can create them later, with a filename prefix between 01 and 30 so each one sorts between the input and output configuration files.
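Logstash reads the conf.d files in lexical (sorted) order, which is why those numeric prefixes matter. A quick sketch against a scratch directory shows where a new filter would land (the 20-apache.conf name is hypothetical):

```shell
# Sketch: files in conf.d are concatenated in sorted order, so a filter
# named 10-*.conf or 20-*.conf lands between the 01 input and 30 output.
tmp=$(mktemp -d)
touch "$tmp/01-lumberjack-input.conf" \
      "$tmp/30-lumberjack-output.conf" \
      "$tmp/10-syslog.conf" \
      "$tmp/20-apache.conf"   # hypothetical extra filter
ls "$tmp"
# → 01-lumberjack-input.conf
# → 10-syslog.conf
# → 20-apache.conf
# → 30-lumberjack-output.conf
rm -rf "$tmp"
```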
- Restart logstash
service logstash restart
- Disable Logstash's built-in web frontend:
service logstash-web stop
update-rc.d -f logstash-web remove
Part 3 > Install Elasticsearch
- Download and install Elasticsearch:
wget https://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-1.4.2.deb
dpkg -i elasticsearch-1.4.2.deb
- Edit the Elasticsearch config to allow Kibana to talk to it; add this at the end of /etc/elasticsearch/elasticsearch.yml:
http.cors.enabled: true
http.cors.allow-origin: "/.*/"
script.disable_dynamic: true
- Restart elasticsearch
service elasticsearch restart
Part 4 > Install Kibana (Web Frontend)
- Download the Kibana package, unpack it, and move it to the /var/www folder:
wget https://download.elasticsearch.org/kibana/kibana/kibana-3.1.2.tar.gz
tar xvf kibana*
mv kibana-3.1.2 /var/www/kibana
- Edit config.js in /var/www/kibana/ and replace port 9200 with 80:
elasticsearch: "http://"+window.location.hostname+":80",
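The same edit can be scripted with sed. A sketch, shown here against a scratch copy of config.js rather than the real /var/www/kibana/config.js:

```shell
# Sketch: swap the Elasticsearch port in config.js with sed,
# demonstrated on a one-line scratch copy of the relevant setting.
tmp=$(mktemp -d)
cat > "$tmp/config.js" <<'EOF'
elasticsearch: "http://"+window.location.hostname+":9200",
EOF
sed -i 's/:9200/:80/' "$tmp/config.js"
cat "$tmp/config.js"
# → elasticsearch: "http://"+window.location.hostname+":80",
rm -rf "$tmp"
```

Point the sed command at /var/www/kibana/config.js to make the change in place.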
- Create a virtualhosts file for Kibana in Apache2 for /var/www/kibana
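A minimal vhost for that could look like the following sketch (the ServerName is a placeholder; replace it with your own hostname, save it as something like /etc/apache2/sites-available/kibana.conf, and enable it with a2ensite):

```apache
# Minimal sketch of a Kibana vhost for Apache 2.4; ServerName is a
# placeholder. Enable with: a2ensite kibana && service apache2 reload
<VirtualHost *:80>
    ServerName kibana.example.com
    DocumentRoot /var/www/kibana
    <Directory /var/www/kibana>
        Require all granted
    </Directory>
</VirtualHost>
```

Note that because config.js now points Kibana at port 80, the vhost also has to pass Elasticsearch traffic through to port 9200 (e.g. via mod_proxy); this sketch only serves the static Kibana files.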
Part 5 > Install Logstash Forwarder
UPDATE: As of 2/16/2015, the deb repo was taken down. I need to figure out the steps to compile from the master branch, since that seems to be the only way. Full discussion here.
UPDATE (02/23/2015): Here are the steps to compile from master.
Do these steps on each server:
- Copy crt from Logstash server to each forwarding machine:
scp /etc/pki/tls/certs/logstash-forwarder.crt username@remoteip:/tmp
- Compile from source:
- Download the zip file from GitHub: https://github.com/elasticsearch/logstash-forwarder
- Unzip and cd to the directory.
- Make sure you have the Go build tools installed; if not:
- apt-get install gccgo-go
- go build
- mkdir -p /opt/logstash-forwarder/bin/
- mv logstash-forwarder-master /opt/logstash-forwarder/bin/logstash-forwarder
- Install the init script so Logstash Forwarder starts on bootup:
cd /etc/init.d/
sudo wget https://raw.github.com/elasticsearch/logstash-forwarder/master/logstash-forwarder.init -O logstash-forwarder
sudo chmod +x logstash-forwarder
sudo update-rc.d logstash-forwarder defaults
- Copy the certs over:
mkdir -p /etc/pki/tls/certs
cp /tmp/logstash-forwarder.crt /etc/pki/tls/certs/
- Create and edit the logstash forwarder config file:
nano /etc/logstash-forwarder

{
  "network": {
    "servers": [ "logstashserverip:5000" ],
    "timeout": 15,
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt"
  },
  "files": [
    {
      "paths": [ "/var/log/syslog", "/var/log/auth.log" ],
      "fields": { "type": "syslog" }
    }
  ]
}
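That file is strict JSON, and a stray comma will keep the forwarder from starting. One way to sanity-check it before restarting, as a sketch (shown against a scratch copy; python3 is used here only as a convenient JSON validator):

```shell
# Sketch: validate the forwarder config as strict JSON. Point
# json.tool at /etc/logstash-forwarder for the real file.
tmp=$(mktemp)
cat > "$tmp" <<'EOF'
{
  "network": {
    "servers": [ "logstashserverip:5000" ],
    "timeout": 15,
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt"
  },
  "files": [
    {
      "paths": [ "/var/log/syslog", "/var/log/auth.log" ],
      "fields": { "type": "syslog" }
    }
  ]
}
EOF
python3 -m json.tool "$tmp" >/dev/null && echo "valid JSON"
# → valid JSON
rm -f "$tmp"
```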
- Restart the service on each forwarding machine and check Kibana to see that they are successfully shipping their logs.
service logstash-forwarder restart