I have Ubuntu 15.10. I wrote a shell script to execute several commands:
Log in to root (" root@dalya-B5400:/home/hduser "), enter the Snort directory, start IDS mode, convert the captured packets to text format, and finally log out of that directory and root. Then log in to the hadoop user (" hduser@dalya-B5400 "), start all Hadoop processes, and send the Snort log file to Hadoop.
I'm in the account " hduser@dalya-B5400 ", which is the normal user. I need some commands to run as " root@dalya-B5400:/home/hduser ", so I used $ sudo su and switched to it. After finishing my job there, I want to return to the normal user " hduser@dalya-B5400 ".
My script worked until the switch from root to the hadoop user. There I hit a problem. I tried these commands (one at a time):
$ su - & sshpass -p password ssh -o StrictHostKeyChecking=no hduser@dalya-B5400
$ sudo -iu hduser
$ sudo su - hduser
Each logged in to the hadoop user, but then exited without executing the rest of the commands after it.
I also tried calling a second shell script from the current one, but it gave the same result and didn't switch to the normal user.
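All three attempts behave the same way for the same reason: `su -`, `sudo -iu`, and a bare `ssh` each start a new interactive shell, and the script lines written after them belong to the outer script, so they only run once that inner shell exits (back as the original user). A runnable illustration of the mechanism with an ordinary child shell, no root involved:

```shell
#!/bin/bash
# The child shell runs its command and returns; only then does the parent
# script continue. With `su - hduser` the child is interactive, so the
# parent resumes only after you type `exit`.
bash -c 'echo "inside child shell"'
echo "parent script continues here"
```

So instead of switching shells mid-script, pass the commands to the target user directly, e.g. `sudo -u hduser /home/hduser/hadoop` or put the remote command on the ssh line itself (a sketch using the names from the question, not a tested fix).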
My primary shell script is named snort-command and contains:
#!/bin/bash
cd ~/snort5_src
cd snort-2.9.9.0
snort -dev -n 20 -l /home/hduser/log9 -b -c /etc/snort5/snort.conf
chmod a+rwx /home/hduser/log9/snort.log.*
tcpdump -n -tttt -r /home/hduser/log9/snort.log.* > /home/hduser/log9/bigfile2.txt
sshpass -p password ssh -o StrictHostKeyChecking=no hduser@dalya-B5400
/home/hduser/hadoop
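The last two lines of this script hit the problem described above: ssh opens an interactive session, and `/home/hduser/hadoop` runs locally only after that session closes. One sketch of a fix (assuming the rest of the script works) is to hand ssh the remote command as an argument, so the hadoop script runs on the remote side before ssh returns:

```
sshpass -p password ssh -o StrictHostKeyChecking=no hduser@dalya-B5400 '/home/hduser/hadoop'
```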
and the second shell script is named hadoop and contains:
#!/bin/bash
/usr/local/hadoop/bin/start-all.sh
hadoop fs -put /home/hduser/log9/bigfile2.txt user/hduser/li
I also tried opening a new terminal from the current one:
$ gnome-terminal
But it also opens as the current user, not the normal one.
Any suggestions?
I solved this issue with these steps:
1- Add sudo to the snort commands instead of logging in to the root user, so I don't need to log out later.
sudo snort -dev -n 20 -l /home/hduser/log9 -b -c /etc/snort5/snort.conf
sudo chmod a+rwx /home/hduser/log9/snort.log.*
sudo tcpdump -n -tttt -r /home/hduser/log9/snort.log.* > /home/hduser/log9/bigfile2.txt
And to run them without a password, add this line via visudo:
hduser ALL=(ALL) NOPASSWD: ALL
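If passwordless sudo for everything feels too broad, sudoers also accepts a command list, so only the three commands the script needs are exempted. A sketch — the binary paths are assumptions and should be checked with `which snort`, `which chmod`, and `which tcpdump`:

```
hduser ALL=(ALL) NOPASSWD: /usr/local/bin/snort, /bin/chmod, /usr/sbin/tcpdump
```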
2- At this point it worked until sending the file to Hadoop, which gave a message saying:
nameNode in safe mode
I found out the problem was that the job was running before the NameNode had left safe mode after startup. So I started the Hadoop processes a few minutes before executing the script, and it worked fine.
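Instead of starting Hadoop a few minutes early, the script can wait explicitly: `hdfs dfsadmin -safemode wait` (or `hadoop dfsadmin -safemode wait` on older releases) blocks until the NameNode leaves safe mode, so the put can follow immediately. A sketch using the paths from the question, untested against this cluster:

```
/usr/local/hadoop/bin/start-all.sh
hdfs dfsadmin -safemode wait    # blocks until the NameNode leaves safe mode
hadoop fs -put /home/hduser/log9/bigfile2.txt user/hduser/li
```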