Install Hive 2.1.1 and Configure MySQL Metastore for Hive
Step 1: Download Hive 2.1.1
Step 2: Extract it
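For example (a sketch; the Apache archive URL and the ~/Downloads location are assumptions, adjust them for your system):
$cd ~/Downloads
$wget https://archive.apache.org/dist/hive/hive-2.1.1/apache-hive-2.1.1-bin.tar.gz
$tar -xzf apache-hive-2.1.1-bin.tar.gz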
Step 3: Download mysql-connector-java-5.1.30, extract it, and copy mysql-connector-java-5.1.30-bin.jar to the lib directory inside the Hive installation
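A possible sequence (assuming the connector tarball sits in ~/Downloads next to the extracted Hive directory):
$cd ~/Downloads
$tar -xzf mysql-connector-java-5.1.30.tar.gz
$cp mysql-connector-java-5.1.30/mysql-connector-java-5.1.30-bin.jar apache-hive-2.1.1-bin/lib/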
Step 4: Delete the log4j-slf4j-impl-2.4.1.jar file from the lib directory under apache-hive-2.1.1-bin (it conflicts with the SLF4J binding already provided by Hadoop)
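For example (path assumed from the steps above):
$rm ~/Downloads/apache-hive-2.1.1-bin/lib/log4j-slf4j-impl-2.4.1.jar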
Step 5: Start Hadoop and create the HDFS directories
$start-dfs.sh
$start-yarn.sh
$hdfs dfs -mkdir -p /user/hive/warehouse
$hdfs dfs -mkdir -p /tmp/hive
Step 6: Change directory permissions
$hdfs dfs -chmod 777 /tmp/
$hdfs dfs -chmod 777 /user/hive/warehouse
$hdfs dfs -chmod 777 /tmp/hive
Step 7: Install MySQL
$sudo apt-get install mysql-server
Step 8: Create the metastore database
$ mysql -u root -p
Enter password:
mysql> CREATE DATABASE metastore_db;
mysql> USE metastore_db;
/* Now run the schema script; specify the location of the script file on your system */
mysql> SOURCE /home/<username>/Downloads/apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-0.14.0.mysql.sql;
Step 9: Create the Hive user and password
mysql> CREATE USER 'hiveuser'@'%' IDENTIFIED BY 'hivepassword';
mysql> GRANT ALL ON *.* TO 'hiveuser'@'localhost' IDENTIFIED BY 'hivepassword';
mysql> FLUSH PRIVILEGES;
Step 10: Go to the conf directory under apache-hive-2.1.1-bin and rename hive-default.xml.template to hive-site.xml and hive-env.sh.template to hive-env.sh
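For example (assuming Hive was extracted under ~/Downloads; cp keeps the original templates, mv works just as well):
$cd ~/Downloads/apache-hive-2.1.1-bin/conf
$cp hive-default.xml.template hive-site.xml
$cp hive-env.sh.template hive-env.sh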
1. In hive-site.xml, change the following properties:
a) ConnectionURL
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://localhost/metastore_db?createDatabaseIfNotExist=true</value>
</property>
b) ConnectionUserName
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hiveuser</value>
</property>
c) ConnectionPassword
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>hivepassword</value>
</property>
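d) ConnectionDriverName (not listed in the original steps, but the template defaults to the Derby driver, so it usually has to be pointed at the MySQL JDBC driver as well)
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
</property>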
2. In hive-env.sh, append HADOOP_HOME at the end of the file:
export HADOOP_HOME=(Location of Hadoop on your system)
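For example (the path below is only an illustration; use your own Hadoop location):
export HADOOP_HOME=/usr/local/hadoop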
3. Replace the following values in hive-site.xml
<property>
<name>hive.exec.local.scratchdir</name>
<value>${system:java.io.tmpdir}/${system:user.name}</value>
<description>Local scratch space for Hive jobs</description>
</property>
<property>
<name>hive.downloaded.resources.dir</name>
<value>${system:java.io.tmpdir}/${hive.session.id}_resources</value>
<description>Temporary local directory for added resources in the remote file system.</description>
</property>
with these values:
<property>
<name>hive.exec.local.scratchdir</name>
<value>/tmp/${user.name}</value>
<description>Local scratch space for Hive jobs</description>
</property>
<property>
<name>hive.downloaded.resources.dir</name>
<value>/tmp/${user.name}_resources</value>
<description>Temporary local directory for added resources in the remote file system.</description>
</property>
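If you prefer not to hunt through the large template by hand, a sed sketch along these lines performs the same two substitutions (it edits hive-site.xml in place and will also touch any other property that uses the same value patterns):
$cd ~/Downloads/apache-hive-2.1.1-bin/conf
$sed -i -e 's|\${system:java\.io\.tmpdir}/\${system:user\.name}|/tmp/${user.name}|g' \
        -e 's|\${system:java\.io\.tmpdir}/\${hive\.session\.id}_resources|/tmp/${user.name}_resources|g' \
        hive-site.xml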
Step 11: Set the path in .bashrc
$sudo gedit ~/.bashrc
Add these lines at the end of the file:
export HIVE_HOME=/home/hadoop/Downloads/apache-hive-2.1.1-bin
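If hive or schematool later comes back as "command not found" (a problem several readers hit), also add Hive's bin directory to the PATH and reload the file:
export PATH=$PATH:$HIVE_HOME/bin
$source ~/.bashrc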
Step 12: Initialize the metastore schema
$schematool -initSchema -dbType mysql
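To confirm that the schema landed in MySQL (an optional check, using the user and database created in the earlier steps):
$mysql -u hiveuser -p
mysql> USE metastore_db;
mysql> SHOW TABLES;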
Step 13: Start Hive
$hive
And you are in the Hive shell 😇
Comments
I'm getting the below error in the last step, please advise:
schematool -initSchema -dbType mysql
Metastore connection URL: jdbc:mysql://localhost/metastore_db?createDatabaseIfNotExist=true
Metastore Connection Driver : com.mysql.jdbc.Driver
Metastore connection User: hiveuser
Starting metastore schema initialization to 2.1.0
Initialization script hive-schema-2.1.0.mysql.sql
Error: Duplicate key name 'PCS_STATS_IDX' (state=42000,code=1061)
org.apache.hadoop.hive.metastore.HiveMetaException: Schema initialization FAILED! Metastore state would be inconsistent !!
Underlying cause: java.io.IOException : Schema script failed, errorcode 2
Use --verbose for detailed stacktrace.
*** schemaTool failed ***
same here
same
Drop database metastore_db;
Just skip the step that involves the "SOURCE" command; the tables will be created automatically.
Good tutorial. I also faced the above error @Vigneshwaran R. I resolved it by not explicitly creating a metastore database, as Hive will automatically create the metastore database when the schema is initialized. So you just have to skip the metastore commands. These commands are not to be run:
mysql> CREATE DATABASE metastore_db;
mysql> USE metastore_db;
/* Now run the schema script; specify the location of the script file on your system */
mysql> SOURCE /home//Downloads/apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-0.14.0.mysql.sql;
The rest is fine. Then you can start the Hive shell.
Thanks Hasan!!, you made my day!!
I dropped the databases that I had created and reran the command "schematool -initSchema -dbType mysql"; it worked perfectly after that!!
God bless you Hasan, you have rescued me from all this stress. I dropped the metastore_db database that I created, then ran "schematool -initSchema -dbType mysql", and it worked.
It worked.
Thanks man... it worked for me.
DeleteGood blog!!! I tried this on CentOS 7, and it works fine. but please remove some line if it's not required -
ReplyDeletemysql> CREATE DATABASE metastore_db;
mysql> USE metastore_db;
mysql> SOURCE /home//hive-2.1.1/scripts/metastore/upgrade/mysql/hive-schema-0.14.0.mysql.sql;
Keep it up @Sandi :)
Good tutorial. When I run the command "$schematool -initSchema -dbType mysql", I get the message "schematool: command not found". How do I resolve this?
Verify HIVE_HOME in your bashrc.
Step 12 is not working for me as well.
OS Ubuntu 18.04 and Hadoop version 2.9.0, single standalone.
When I run the following in a terminal
$schematool -initSchema -dbType mysql
I get the following error:
"command not found"
I checked the .bashrc file as well; the correct HIVE_HOME path is set.
Sometimes HIVE_HOME in bashrc alone doesn't work.
Try export PATH=$PATH:{path to your bin directory of hive}
Hello Sandeep,
You have been a great help to me by posting this step-by-step installation in this blog.
I wasted 3 days trying to install Hive 2.1.1 on Hadoop 2.x.
I searched a lot of stuff on Google, and no one resolved my issue.
Finally you saved me and my time.
Thanks Sandeep.
I am getting this error while creating a new database or running show databases;
FAILED: SemanticException org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
Did you manage to solve this issue? Best regards.
DeleteHi, I am getting below error : please help me on this .I not able to overcome this problem :
ReplyDeleteException in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/mapred/JobConf
at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:3694)
at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:3652)
at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:82)
at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:66)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:657)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapred.JobConf
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 12 more
Thanks bhava!
Good initiative.
Keep it up!
Hi... The tutorial is good; however, after extracting and configuring, at the moment of preparing the schema with (schematool -initSchema -dbType mysql) I get the error:
--------------
2018-05-16 11:48:19,991 main WARN Unable to instantiate org.fusesource.jansi.WindowsAnsiOutputStream
2018-05-16 11:48:19,991 main WARN Unable to instantiate org.fusesource.jansi.WindowsAnsiOutputStream
org.apache.hadoop.hive.metastore.HiveMetaException: File /usr/hive\scripts\metastore\upgrade\mysql\upgrade.order.mysqlnot found
Underlying cause: java.io.FileNotFoundException : \usr\hive\scripts\metastore\upgrade\mysql\upgrade.order.mysql (The system cannot find the specified path)
Use --verbose for detailed stacktrace.
*** schemaTool failed ***
-------------------
I believe the main problem is the path. Where is the hive\scripts\metastore\upgrade path defined? It seems it is not being read correctly.
Thanks
In the bashrc step we also need to add the following:
export PATH=$PATH:$HIVE_HOME/bin
All steps were completed successfully, but when I run the hive command in the terminal I still get the following error. Does anyone know the solution?
Exception in thread "main" java.lang.ClassCastException: java.base/jdk.internal.loader.ClassLoaders$AppClassLoader cannot be cast to java.base/java.net.URLClassLoader
at org.apache.hadoop.hive.ql.session.SessionState.<init>(SessionState.java:387)
...
Sir, I am getting this error in the last step:
Metastore connection URL: jdbc:mysql://localhost/metastore_db
Metastore Connection Driver : com.mysql.jdbc.Driver
Metastore connection User: hiveuser
Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.
Underlying cause: java.sql.SQLException : The server time zone value 'PDT' is unrecognized or represents more than one time zone. You must configure either the server or JDBC driver (via the serverTimezone configuration property) to use a more specifc time zone value if you want to utilize time zone support.
SQL Error code: 0
Use --verbose for detailed stacktrace.
*** schemaTool failed ***
I am getting the following issue:
hadoop1@phanikishore-Inspiron-5558:~/Desktop$ schematool -initSchema -dbType mysql
Exception in thread "main" java.lang.RuntimeException: com.ctc.wstx.exc.WstxParsingException: Illegal character entity: expansion character (code 0x8
at [row,col,system-id]: [3210,96,"file:/home/hadoop1/Desktop/apache-hive-3.1.0-bin/conf/hive-site.xml"]
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3003)
at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2931)
at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2806)
at org.apache.hadoop.conf.Configuration.get(Configuration.java:1460)
at org.apache.hadoop.hive.conf.HiveConf.getVar(HiveConf.java:4990)
at org.apache.hadoop.hive.conf.HiveConf.getVar(HiveConf.java:5063)
at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5150)
at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5098)
at org.apache.hive.beeline.HiveSchemaTool.<init>(HiveSchemaTool.java:96)
at org.apache.hive.beeline.HiveSchemaTool.main(HiveSchemaTool.java:1473)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
Caused by: com.ctc.wstx.exc.WstxParsingException: Illegal character entity: expansion character (code 0x8
at [row,col,system-id]: [3210,96,"file:/home/hadoop1/Desktop/apache-hive-3.1.0-bin/conf/hive-site.xml"]
at com.ctc.wstx.sr.StreamScanner.constructWfcException(StreamScanner.java:621)
at com.ctc.wstx.sr.StreamScanner.throwParseError(StreamScanner.java:491)
at com.ctc.wstx.sr.StreamScanner.reportIllegalChar(StreamScanner.java:2456)
at com.ctc.wstx.sr.StreamScanner.validateChar(StreamScanner.java:2403)
at com.ctc.wstx.sr.StreamScanner.resolveCharEnt(StreamScanner.java:2369)
at com.ctc.wstx.sr.StreamScanner.fullyResolveEntity(StreamScanner.java:1515)
at com.ctc.wstx.sr.BasicStreamReader.nextFromTree(BasicStreamReader.java:2828)
at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1123)
at org.apache.hadoop.conf.Configuration$Parser.parseNext(Configuration.java:3257)
at org.apache.hadoop.conf.Configuration$Parser.parse(Configuration.java:3063)
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2986)
... 15 more
Can anyone please help me with this?
While running the command $schematool -initSchema -dbType mysql
I am getting the below error:
Exception in thread "main" java.lang.IllegalAccessError: tried to access field org.slf4j.impl.StaticLoggerBinder.SINGLETON from class org.slf4j.LoggerFactory
at org.slf4j.LoggerFactory.(LoggerFactory.java:60)
at org.apache.hive.beeline.HiveSchemaTool.(HiveSchemaTool.java:65)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
Can someone help me resolve this issue?
I am getting an error in step 9. I created the user, but when I grant the privileges I get an access denied error. Please help me out with this.
mysql> CREATE USER 'hvuser'@'%' IDENTIFIED BY 'apple';
Query OK, 0 rows affected (0.00 sec)
mysql> GRANT all on *.* to 'hvuser'@localhost identified by 'apple';
ERROR 1698 (28000): Access denied for user 'hduser'@'localhost'
mysql> GRANT all on *.* to 'hvuser'@localhost identified by 'apple';
ERROR 1698 (28000): Access denied for user 'hduser'@'localhost'
mysql> GRANT all privileges on *.* to 'hvuser'@localhost identified by 'apple';
ERROR 1698 (28000): Access denied for user 'hduser'@'localhost'
Hi all, I am getting the following error. Any clue?
Metastore connection URL: jdbc:mysql://localhost/metastore_db?createDatabaseIfNotExist=true
Metastore Connection Driver : com.mysql.jdbc.Driver
Metastore connection User: hiveuser
org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.
Underlying cause: com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException : Could not create connection to database server.
SQL Error code: 0
Use --verbose for detailed stacktrace.
*** schemaTool failed ***
I am getting this error:
org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.
Underlying cause: java.sql.SQLException : Unknown system variable 'query_cache_size'