Tags: service, hive, hadoop2

hiveserver2 hangs while starting. Nothing listening on port 10000


I have been trying to set up a local single-node Apache big data stack. I was successful in setting up Hadoop, and HDFS and YARN are working fine. However, I have been trying to get Hive up and running for the last few hours with no luck. When I run "hive --service hiveserver2", it prints a few lines and then hangs. I checked whether anything is listening on port 10000, but there is nothing. Below is the output of the command "hive --service hiveserver2":

2019-07-27 17:55:54: Starting HiveServer2
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/particle/apache-hive-2.3.5-bin/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/particle/hadoop-2.9.2/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

Since hive-site.xml is big, I cannot paste it here, but if you suspect any property, please let me know and I will paste it. I installed Hive from the tarball, not via Maven.

I guess all this has to do with some sort of SLF4J binding conflict, as mentioned here, but I do not know the steps required to deal with it. Help will be really appreciated.


Solution

  • I tried many times to get Hive working through both the tarball and the Maven installation. For one reason or another, it did not happen.

    Here is how I finally got Hive working (3rd August, 2019).

    First I downloaded the latest Hive .tar.gz file (3.1.1 as of today). After downloading, I made sure that the following was set in the ~/.bashrc file. Please note that the Java version was 1.8 and the Hadoop version was 2.9.2. I am not sure whether the versions are important, but these settings are what worked for me.

    export JAVA_HOME=/home/particle/jdk1.8.0_221
    export PATH=$JAVA_HOME/bin:$PATH
    export HADOOP_HOME=/home/particle/hadoop-2.9.2
    export PATH=$HADOOP_HOME/bin:$PATH
    export HIVE_HOME=/home/particle/apache-hive-3.1.1-bin
    export PATH=$HIVE_HOME/bin:$PATH
    

    After that, I sourced the ~/.bashrc file (as below).

    source ~/.bashrc
    

    Before proceeding further, I made sure that HDFS and YARN were running. If not, start them with $HADOOP_HOME/sbin/start-dfs.sh and $HADOOP_HOME/sbin/start-yarn.sh, and confirm with jps that NameNode, SecondaryNameNode, DataNode, ResourceManager and NodeManager are running, as shown below.
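
    Put together, the start and check commands are:

    $HADOOP_HOME/sbin/start-dfs.sh
    $HADOOP_HOME/sbin/start-yarn.sh
    jps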

    Then I created a few directories and set the permissions right. Some of the directories might already exist for you, so don't worry.

    hadoop fs -mkdir       /tmp
    hadoop fs -mkdir       /user
    hadoop fs -mkdir       /user/hive/
    hadoop fs -mkdir       /user/hive/warehouse
    hadoop fs -chmod g+w   /tmp
    hadoop fs -chmod g+w   /user/hive/warehouse
    

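    To double-check that the directories and permissions came out as expected, a quick listing helps:

    hadoop fs -ls /
    hadoop fs -ls -R /user/hive
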
    After that I initialized the Derby db by typing the following. I am not sure whether this is necessary, but I did it anyway.

    $HIVE_HOME/bin/schematool -dbType derby -initSchema
    

    After that, I created a file called 'hive-site.xml' and placed it in $HIVE_HOME/conf. The file had the following contents. Make sure you make the appropriate changes. What we are doing below is setting things up to use MySQL as the database for Hive.

    <?xml version="1.0" encoding="UTF-8" standalone="no"?>
    <?xml-stylesheet type="text/xsl" href="configuration.xsl"?><!--
       Licensed to the Apache Software Foundation (ASF) under one or more
       contributor license agreements.  See the NOTICE file distributed with
       this work for additional information regarding copyright ownership.
       The ASF licenses this file to You under the Apache License, Version 2.0
       (the "License"); you may not use this file except in compliance with
       the License.  You may obtain a copy of the License at
    
           http://www.apache.org/licenses/LICENSE-2.0
    
       Unless required by applicable law or agreed to in writing, software
       distributed under the License is distributed on an "AS IS" BASIS,
       WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
       See the License for the specific language governing permissions and
       limitations under the License.
    -->
    <configuration>
    <property>
      <name>hadoop.proxyuser.TypeYourUserNameHereForTheOSOrVirtualOS.groups</name>
      <value>*</value>
    </property>
    <property>
      <name>hadoop.proxyuser.TypeYourUserNameHereForTheOSOrVirtualOS.hosts</name>
      <value>*</value>
    </property>
    
    <property>
      <name>javax.jdo.option.ConnectionURL</name>
      <value>jdbc:mysql://localhost/metastore?createDatabaseIfNotExist=true</value>
    </property>
    
    
    <property>
      <name>javax.jdo.option.ConnectionDriverName</name>
      <value>com.mysql.jdbc.Driver</value>
    </property>
    
    
    <property>
      <name>javax.jdo.option.ConnectionUserName</name>
      <value>TypeYourUserNameHereForTheOSOrVirtualOS_YouWillShortlyCreateThisUserInMySQL</value>
    </property>
    
    <property>
      <name>javax.jdo.option.ConnectionPassword</name>
      <value>TypeYourPassword_YouWillShortlyCreateThisPassWordInMySQL</value>
    </property>
    
    <property>
      <name>datanucleus.autoCreateSchema</name>
      <value>true</value>
    </property>
    
    <property>
      <name>datanucleus.fixedDatastore</name>
      <value>true</value>
    </property>
    
    <property>
      <name>datanucleus.autoCreateTables</name>
      <value>true</value>
    </property>
    </configuration>
    

    After that I installed the MySQL client and server. Because I was using Ubuntu, my command was as simple as the one below. This step will change depending upon your OS.

    sudo apt-get install mysql-client mysql-server
    

    After that I downloaded the JDBC driver needed to make Hive work with MySQL. I used this link (https://dev.mysql.com/downloads/connector/j/5.1.html), but this may vary depending upon which MySQL version you are using. Once you unzip/untar it, you will see one or more .jar files inside the extracted folder. I copied all of them into $HIVE_HOME/lib, as sketched below.
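
    As a rough sketch, assuming the Connector/J archive was downloaded to ~/Downloads and extracts to a folder named mysql-connector-java-5.1.48 (your version and paths will differ), the copy step looks like this:

    tar -xzf ~/Downloads/mysql-connector-java-5.1.48.tar.gz -C ~/Downloads
    cp ~/Downloads/mysql-connector-java-5.1.48/*.jar $HIVE_HOME/lib/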

    After that I logged into MySQL. Since I did not have a password set at the time of installation, I had to use the following commands. If you already have a root/admin login and password, then you can skip the first three lines. Please make sure that you have first changed into the directory "$HIVE_HOME/scripts/metastore/upgrade/mysql/".

    sudo mysql -u root
    # It will first ask for the sudo password.
    # Then it will either take you straight to the mysql prompt or ask for the MySQL root password.
    
    USE mysql;
    CREATE USER 'YourUserNameThatYouSavedIn_hive-site.xml'@'localhost' IDENTIFIED BY 'YourPasswordThatYouSavedIn_hive-site.xml';
    GRANT ALL PRIVILEGES ON *.* TO 'YourUserNameThatYouSavedIn_hive-site.xml'@'localhost';
    FLUSH PRIVILEGES;
    EXIT;
    

    Now I had a MySQL account that matched my settings in hive-site.xml. I logged into MySQL with the username and password created above, as shown next.
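
    Combining that with the directory change mentioned earlier (the username placeholder matches the one used in hive-site.xml; mysql prompts for the password):

    cd $HIVE_HOME/scripts/metastore/upgrade/mysql/
    mysql -u YourUserNameThatYouSavedIn_hive-site.xml -p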

    Once in the mysql shell, type the lines below, one at a time.

    DROP DATABASE IF EXISTS hive;
    CREATE DATABASE hive;
    USE hive;
    SOURCE hive-schema-3.1.0.mysql.sql;
    EXIT;
    

    Then I typed the following command in the bash shell. A lot of warnings came up.

    $HIVE_HOME/bin/schematool -dbType mysql -initSchema
    

    Then I finally ran the magic command.

    $HIVE_HOME/bin/beeline -u jdbc:hive2://
    

    After a lot of warnings, it took me to the prompt I had been waiting for. Below is the result. Still a lot of warnings, but I will take it for the time being. I had been after this for the last 3 days.

    Final Output
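
    As a quick sanity check at the beeline prompt, a couple of simple statements (test_db is just a placeholder name) confirm that the metastore connection works:

    SHOW DATABASES;
    CREATE DATABASE IF NOT EXISTS test_db;
    SHOW DATABASES;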