Centinel Installation Guide
===========================
This document describes how to install the artifacts needed to use
Centinel functionality in OpenDaylight by enabling the default Centinel
feature. Centinel is a distributed, reliable framework for the
collection, aggregation, and analysis of streaming data, added in the
OpenDaylight Beryllium release.
The Centinel project aims to provide a distributed, reliable framework
for efficiently collecting, aggregating, and sinking streaming data
across a persistence database and stream analyzers (e.g., Graylog,
Elasticsearch, Spark, Hive). This framework enables SDN
applications/services to receive events from multiple streaming sources
(e.g., Syslog, Thrift, Avro, AMQP, Log4j, HTTP/REST).
In Beryllium, we developed a Log Service and a plug-in for a log
analyzer (e.g., Graylog). The Log Service processes real-time events
coming from the log analyzer. Additionally, we provide a stream
collector (Flume- and Sqoop-based) that collects logs from OpenDaylight
and sinks them to the persistence service (integrated with TSDR).
Centinel also includes a RESTCONF interface to inject events into
northbound applications for real-time analytics/network configuration.
Further, a Centinel user interface (web interface) will be available to
operators to enable rules/alerts/dashboards, etc.
Prerequisites for Installing Centinel
-------------------------------------

* Recent 64-bit Linux distribution with 16 GB RAM
* Java Virtual Machine 1.7 or above
* Apache Maven 3.1.1 or above
Preparing for Installation
--------------------------

There are some additional prerequisites for Centinel, which can be met
by installing and integrating the Graylog server, Apache Drill, Apache
Flume, and HBase.
Graylog2 Server Installation
^^^^^^^^^^^^^^^^^^^^^^^^^^^^

* Import the MongoDB public GPG key into apt::

    sudo apt-key adv --keyserver keyserver.ubuntu.com --recv 7F0CEB10

* Create the MongoDB source list::

    echo 'deb http://downloads-distro.mongodb.org/repo/debian-sysvinit dist 10gen' | sudo tee /etc/apt/sources.list.d/mongodb.list

* Update your apt package database::

    sudo apt-get update

* Install the latest stable version of MongoDB with this command::

    sudo apt-get install mongodb-org
* Install Elasticsearch

* Graylog2 v0.20.2 requires Elasticsearch v0.90.10. Download and install it with these commands::

    cd ~; wget https://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-0.90.10.deb
    sudo dpkg -i elasticsearch-0.90.10.deb

* We need to change the Elasticsearch ``cluster.name`` setting. Open the Elasticsearch configuration file::

    sudo vi /etc/elasticsearch/elasticsearch.yml

* Find the section that specifies ``cluster.name``. Uncomment it and replace the default value with ``graylog2``::

    cluster.name: graylog2

* Find the line that specifies ``network.bind_host`` and uncomment it so it looks like this::

    network.bind_host: localhost
    script.disable_dynamic: true

* Save and quit. Next, restart Elasticsearch to put the changes into effect::

    sudo service elasticsearch restart

* After a few seconds, run the following to test that Elasticsearch is running properly::

    curl -XGET 'http://localhost:9200/_cluster/health?pretty=true'
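A healthy reply reports a ``status`` of ``green`` (or ``yellow`` on a single-node setup). As a quick sketch, the status field can be pulled out of the JSON reply with ``sed``; the sample response below is illustrative, not captured from a live cluster::

    # Extract the "status" field from a cluster-health reply.
    # The sample JSON is illustrative of what the curl command above returns.
    response='{ "cluster_name" : "graylog2", "status" : "green", "timed_out" : false }'
    status=$(printf '%s' "$response" | sed 's/.*"status" : "\([a-z]*\)".*/\1/')
    echo "$status"

If the value is ``red``, Elasticsearch is not healthy and Graylog2 will not be able to store messages.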
* Install Graylog2 server

* Download the Graylog2 archive to /opt with this command::

    cd /opt; sudo wget https://github.com/Graylog2/graylog2-server/releases/download/0.20.2/graylog2-server-0.20.2.tgz

* Then extract the archive::

    sudo tar xvf graylog2-server-0.20.2.tgz

* Create a symbolic link to the newly created directory, to simplify the directory name::

    sudo ln -s graylog2-server-0.20.2 graylog2-server

* Copy the example configuration file to the proper location, in /etc::

    sudo cp /opt/graylog2-server/graylog2.conf.example /etc/graylog2.conf

* Install pwgen, which we will use to generate password secret keys::

    sudo apt-get install pwgen

* Now we must configure the admin password and secret key. The password secret key is configured in graylog2.conf via the ``password_secret`` parameter. Generate a random key and insert it into the Graylog2 configuration with the following commands::

    SECRET=$(pwgen -s 96 1)
    sudo -E sed -i -e 's/password_secret =.*/password_secret = '$SECRET'/' /etc/graylog2.conf

    PASSWORD=$(echo -n password | shasum -a 256 | awk '{print $1}')
    sudo -E sed -i -e 's/root_password_sha2 =.*/root_password_sha2 = '$PASSWORD'/' /etc/graylog2.conf
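Because a malformed sed expression can silently corrupt /etc/graylog2.conf, it can be worth rehearsing the substitutions on a scratch file first. A minimal sketch, where the secret is a fixed placeholder instead of pwgen output and ``sha256sum`` stands in for ``shasum -a 256``::

    # Dry-run of the two substitutions on a scratch file, not /etc/graylog2.conf.
    tmpconf=$(mktemp)
    printf 'password_secret =\nroot_password_sha2 =\n' > "$tmpconf"

    SECRET=abc123examplesecret                                    # placeholder, not pwgen output
    PASSWORD=$(printf '%s' password | sha256sum | awk '{print $1}')

    sed -i -e 's/password_secret =.*/password_secret = '"$SECRET"'/' "$tmpconf"
    sed -i -e 's/root_password_sha2 =.*/root_password_sha2 = '"$PASSWORD"'/' "$tmpconf"

    cat "$tmpconf"    # inspect the rewritten lines before touching the real config
    rm -f "$tmpconf"

Once the output shows both parameters populated, repeat the commands from the step above against /etc/graylog2.conf.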
* Open the Graylog2 configuration to make a few changes (``sudo vi /etc/graylog2.conf``)::

    rest_transport_uri = http://127.0.0.1:12900/
    elasticsearch_shards = 1

* Now install the Graylog2 init script. Copy graylog2ctl to /etc/init.d::

    sudo cp /opt/graylog2-server/bin/graylog2ctl /etc/init.d/graylog2

* Update the startup script to put the Graylog2 logs in /var/log and to look for the Graylog2 server JAR file in /opt/graylog2-server by running the following two sed commands::

    sudo sed -i -e 's/GRAYLOG2_SERVER_JAR=\${GRAYLOG2_SERVER_JAR:=graylog2-server.jar}/GRAYLOG2_SERVER_JAR=\${GRAYLOG2_SERVER_JAR:=\/opt\/graylog2-server\/graylog2-server.jar}/' /etc/init.d/graylog2
    sudo sed -i -e 's/LOG_FILE=\${LOG_FILE:=log\/graylog2-server.log}/LOG_FILE=\${LOG_FILE:=\/var\/log\/graylog2-server.log}/' /etc/init.d/graylog2

* Install the startup script::

    sudo update-rc.d graylog2 defaults

* Start the Graylog2 server with the service command::

    sudo service graylog2 start
Install Graylog Server Using a Virtual Machine
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

* Download the OVA image from the link below and save it to your local disk:
  https://github.com/Graylog2/graylog2-images/tree/master/ova

* Run the OVA in a virtualization system such as VMware or VirtualBox.
HBase Installation
------------------

* Download hbase-0.98.15-hadoop2.tar.gz

* Extract the tar file using the command below::

    tar -xvf hbase-0.98.15-hadoop2.tar.gz

* Create a directory using the command below::

    sudo mkdir /usr/lib/hbase

* Move hbase-0.98.15-hadoop2 to /usr/lib/hbase using the command below::

    mv hbase-0.98.15-hadoop2 /usr/lib/hbase/

* Configure HBase with Java

* Open hbase/conf/hbase-env.sh and set JAVA_HOME to the path of the Java installation on your system::

    export JAVA_HOME=/usr/lib/jvm/jdk1.7.0_25
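For a standalone setup it can also help to point HBase at explicit data directories via hbase/conf/hbase-site.xml. A minimal sketch, assuming local-filesystem storage; the paths are illustrative, not prescribed by Centinel::

    <!-- Minimal standalone hbase-site.xml sketch; paths are illustrative. -->
    <configuration>
      <property>
        <name>hbase.rootdir</name>
        <value>file:///home/user/hbase-data</value>
      </property>
      <property>
        <name>hbase.zookeeper.property.dataDir</name>
        <value>/home/user/zookeeper-data</value>
      </property>
    </configuration>

Without this, standalone HBase defaults to a directory under /tmp, which may be cleared on reboot.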
* Set the HBASE_HOME path in the bashrc file

* Open the bashrc file using this command::

    vi ~/.bashrc

* In the bashrc file, append the following two statements::

    export HBASE_HOME=/usr/lib/hbase/hbase-0.98.15-hadoop2
    export PATH=$PATH:$HBASE_HOME/bin

* To start HBase, issue the following commands::

    HBASE_PATH$ bin/start-hbase.sh
    HBASE_PATH$ bin/hbase shell

* Create the ``centinel`` table in HBase, with ``stream``, ``alert``, ``dashboard``, and ``stringdata`` as column families, using the command below::

    create 'centinel','stream','alert','dashboard','stringdata'
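From the same HBase shell you can confirm that the table and its column families were created; the ``put``/``scan`` pair below is an illustrative smoke test (the row key, qualifier, and value are made-up examples, not data Centinel expects)::

    describe 'centinel'
    put 'centinel','row1','stream:name','sample-stream'
    scan 'centinel'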
* To stop HBase, issue the following command::

    HBASE_PATH$ bin/stop-hbase.sh
Apache Flume Installation
-------------------------

* Download apache-flume-1.6.0.tar.gz

* Copy the downloaded file to the directory where you want to install Flume.

* Extract the contents of the apache-flume-1.6.0.tar.gz file using the command below (use sudo if necessary)::

    tar -xvzf apache-flume-1.6.0.tar.gz

* Navigate to the Flume installation directory.

* Issue the following command to start the flume-ng agent::

    ./flume-ng agent --conf conf --conf-file multiplecolumn.conf --name a1 -Dflume.root.logger=INFO,console
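The agent name ``a1`` and the file name ``multiplecolumn.conf`` come from the command above. As a rough sketch of what such a configuration can look like, the fragment below wires a syslog source to an HBase sink through a memory channel; the source/channel choices and the serializer class name are illustrative assumptions, not the project's shipped configuration::

    # Illustrative sketch of a multiplecolumn.conf for agent "a1".
    a1.sources = r1
    a1.channels = c1
    a1.sinks = k1

    # Syslog TCP source; port 1514 matches the rsyslog forwarding used later.
    a1.sources.r1.type = syslogtcp
    a1.sources.r1.host = 0.0.0.0
    a1.sources.r1.port = 1514
    a1.sources.r1.channels = c1

    a1.channels.c1.type = memory
    a1.channels.c1.capacity = 1000

    # HBase sink writing into the centinel table created earlier.
    # The serializer class name is a placeholder for the class shipped in
    # centinel-SplittingSerializer.jar (built in a later step).
    a1.sinks.k1.type = org.apache.flume.sink.hbase.HBaseSink
    a1.sinks.k1.table = centinel
    a1.sinks.k1.columnFamily = stringdata
    a1.sinks.k1.serializer = org.opendaylight.centinel.SplittingSerializer
    a1.sinks.k1.channel = c1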
Apache Drill Installation
-------------------------

* Download apache-drill-1.1.0.tar.gz

* Copy the downloaded file to the directory where you want to install Drill.

* Extract the contents of the apache-drill-1.1.0.tar.gz file using the command below::

    tar -xvzf apache-drill-1.1.0.tar.gz

* Navigate to the Drill installation directory.

* Issue the following command to launch Drill in embedded mode::

    bin/drill-embedded

* Access the Apache Drill UI at: http://localhost:8047/

* Go to the "Storage" tab and enable the "HBase" storage plugin.
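Once the HBase plugin is enabled, the ``centinel`` table can be queried from the Drill shell. A sketch, assuming the table created earlier; the ``name`` qualifier under the ``stream`` family is a hypothetical example, and ``CONVERT_FROM`` is used because HBase stores raw bytes::

    -- Query the centinel HBase table through Drill's hbase storage plugin.
    SELECT CONVERT_FROM(t.`row_key`, 'UTF8')        AS row_key,
           CONVERT_FROM(t.`stream`.`name`, 'UTF8')  AS stream_name
    FROM hbase.`centinel` t
    LIMIT 10;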
Installing Centinel
-------------------

* Use the following command to clone the Centinel git repository::

    git clone https://git.opendaylight.org/gerrit/p/centinel

* Navigate to the installation directory and build the code with Maven by running the command below::

    mvn clean install

* After building the Maven project, a JAR file named ``centinel-SplittingSerializer-0.0.1-SNAPSHOT.jar``
  will be created in ``centinel/plugins/centinel-SplittingSerializer/target`` inside the workspace directory.
  Copy this JAR file, rename it to ``centinel-SplittingSerializer.jar`` (as mentioned in the Flume
  configuration file), and save it at ``apache-flume-1.6.0-bin/lib`` inside the Flume directory.

* After a successful build, copy the JAR files present at the locations below to ``/opt/graylog/plugin`` in the Graylog server (VM)::

    centinel/plugins/centinel-alertcallback/target/centinel-alertcallback-0.1.0-SNAPSHOT.jar
    centinel/plugins/centinel-output/target/centinel-output-0.1.0-SNAPSHOT.jar

* Restart the server after adding the plugins using the command below::

    sudo graylog-ctl restart graylog-server
Configure rsyslog
-----------------

Make changes to the following file::

    /etc/rsyslog.conf

* Uncomment ``$InputTCPServerRun 1514``

* Add the following lines::

    module(load="imfile" PollingInterval="10") #needs to be done just once
    input(type="imfile"
      File="<karaf.log>" #location of log file
      StateFile="statefile1")

    *.* @@127.0.0.1:1514 # @@ is used for TCP

* Use the following format and comment out the previous one::

    $ActionFileDefaultTemplate RSYSLOG_SyslogProtocol23Format

* Use the command below to send Centinel logs to a port::

    tail -f <location of log file>/karaf.log|logger

* Restart the rsyslog service after making the above changes in the configuration file::

    sudo service rsyslog restart
Install the following feature
-----------------------------

Finally, from the Karaf console, install the Centinel feature with this command::

    feature:install odl-centinel-all
Verifying your Installation
---------------------------

If the feature installation was successful, you should be able to see the new Centinel commands added in the Karaf console.

Check ``../data/log/karaf.log`` for any exceptions related to Centinel features.
Upgrading From a Previous Release
---------------------------------

Since Beryllium is the first release to support Centinel functionality, only a fresh installation is possible.
Uninstalling Centinel
---------------------

To uninstall the Centinel functionality, run the following from the Karaf console::

    feature:uninstall odl-centinel-all

It is recommended to restart the Karaf container after uninstalling the Centinel functionality.