
Installing Hive 3.1.2 on Ubuntu 20.04

Gao Xiaobo, 2021-09-24


We have already installed Hadoop 3.1.3 and MariaDB 10.2.10; for those steps, see the two earlier posts "Installing a Hadoop 3.1.3 Cluster on Ubuntu 20.04" and "Installing MariaDB 10.2.10 from a Binary Tarball on Ubuntu 20.04".


1. Download the Hive package

cd /usr/local/src/

sudo wget https://downloads.apache.org/hive/hive-3.1.2/apache-hive-3.1.2-bin.tar.gz


2. Extract the archive

sudo tar -zxvf /usr/local/src/apache-hive-3.1.2-bin.tar.gz -C /usr/local/

sudo mv /usr/local/apache-hive-3.1.2-bin /usr/local/apache-hive-3.1.2


3. Edit the Hive configuration file

sudo cp /usr/local/apache-hive-3.1.2/conf/hive-default.xml.template /usr/local/apache-hive-3.1.2/conf/hive-site.xml

Here we use MariaDB to store Hive's metadata rather than the built-in Derby database.
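The metastore database and user can be created up front. A sketch, assuming a root login to MariaDB; `db_hive` matches the JDBC URL configured below, while the user name `hive` and `your_password` are placeholders to substitute:

```shell
# Write the DDL to a file first so it can be reviewed, then feed it to MariaDB.
# db_hive matches the ConnectionURL below; 'hive'/'your_password' are placeholders.
cat > /tmp/create_hive_metastore.sql <<'SQL'
CREATE DATABASE IF NOT EXISTS db_hive;
CREATE USER IF NOT EXISTS 'hive'@'%' IDENTIFIED BY 'your_password';
GRANT ALL PRIVILEGES ON db_hive.* TO 'hive'@'%';
FLUSH PRIVILEGES;
SQL
# Then apply it (prompts for the MariaDB root password):
# mysql -u root -p < /tmp/create_hive_metastore.sql
```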

With the database and user in place in MariaDB, modify the following properties:
 
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://10.11.99.15:3306/db_hive?createDatabaseIfNotExist=true</value>
    <description>JDBC connect string for a JDBC metastore</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
    <description>Driver class name for a JDBC metastore</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>your database username</value>
    <description>username to use against metastore database</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>your database password</value>
    <description>password to use against metastore database</description>
  </property>
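Before starting Hive it is worth confirming that the metastore host in the ConnectionURL is actually reachable on the MariaDB port. A sketch using bash's `/dev/tcp` pseudo-device (10.11.99.15:3306 is this guide's database host; substitute your own):

```shell
# Opening /dev/tcp/HOST/PORT succeeds (exit 0) only if the port answers.
# Host and port come from the ConnectionURL above -- substitute your own.
host=10.11.99.15 port=3306
if timeout 3 bash -c "exec 3<>/dev/tcp/$host/$port" 2>/dev/null; then
    echo "metastore database reachable on $host:$port"
else
    echo "cannot reach $host:$port -- check the address, firewall, and bind-address"
fi
```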


4. Change ownership of the Hive directory

sudo chgrp -R hadoop /usr/local/apache-hive-3.1.2/
sudo chown -R hadoop /usr/local/apache-hive-3.1.2/


5. Start Hive

Make sure the Hadoop cluster is already running before starting Hive.
/usr/local/apache-hive-3.1.2/bin/hive

Errors encountered:
(1)java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument

Cause:
The guava.jar that Hive bundles differs in version from the one bundled with Hadoop, so the call to com.google.common.base.Preconditions.checkArgument resolves against an incompatible class.

Fix:
  1. Check the version of guava.jar under share/hadoop/common/lib in the Hadoop install directory.
  2. Check the version of guava.jar under lib in the Hive install directory. If the two differ, delete the lower version and copy in the higher one.
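The two steps above can be sketched as follows. Hive 3.1.2 ships guava-19.0.jar while Hadoop 3.1.3 ships guava-27.0-jre.jar, so it is Hive's copy that gets replaced; verify the filenames against your own lib directories, and adjust the Hadoop path to your layout:

```shell
# `sort -V` orders version strings, so the last line is the higher guava:
printf '%s\n' guava-19.0.jar guava-27.0-jre.jar | sort -V | tail -n 1   # guava-27.0-jre.jar

# Replace Hive's older jar with Hadoop's newer one (the Hadoop path below is
# an assumption -- point it at your actual Hadoop 3.1.3 install):
# sudo rm /usr/local/apache-hive-3.1.2/lib/guava-19.0.jar
# sudo cp /path/to/hadoop-3.1.3/share/hadoop/common/lib/guava-27.0-jre.jar \
#         /usr/local/apache-hive-3.1.2/lib/
```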

(2) com.ctc.wstx.exc.WstxParsingException: Illegal character entity: expansion character (code 0x8


Cause: line 3215 of the file contains a character entity the XML parser cannot handle.


Fix:
The message is an XML parse failure; open hive-site.xml, go to the reported line, and delete the character entity that cannot be parsed.
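One way to strip it, demonstrated on a scratch copy; the offending entity in the 3.1.2 template is `&#8;` (the 0x8 from the error message). Point the path at your real conf/hive-site.xml, and run sed with sudo, when applying the fix:

```shell
# Demo on a throwaway file; substitute /usr/local/apache-hive-3.1.2/conf/hive-site.xml
# (with sudo) for the real fix.
HIVE_CONF=/tmp/hive-site-demo.xml
printf '<description>for&#8;transactional tables</description>\n' > "$HIVE_CONF"
sed -i 's/&#8;//g' "$HIVE_CONF"          # delete the illegal entity everywhere
grep -c '&#8;' "$HIVE_CONF" || true      # prints 0: no occurrences remain
```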

(3) java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D


Cause:
When we copied the default template earlier, we did not add any path configuration, so the ${system:...} variables in Hive's working-directory paths never resolve.

Fix:
Many articles online say to replace every occurrence of ${system:java.io.tmpdir} in the config file; adding the properties below is cleaner by comparison.

Add the following to hive-site.xml:
 
  <property>
    <name>system:java.io.tmpdir</name>
    <value>/tmp/hive</value>
  </property>
  <property>
    <name>system:user.name</name>
    <value>${user.name}</value>
  </property>

(4)The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH
2021-09-24 03:25:11,951 ERROR DataNucleus.Datastore: Exception thrown creating StoreManager. See the nested exception
Error creating transactional connection factory
org.datanucleus.exceptions.NucleusException: Error creating transactional connection factory
        at org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:214)
        at org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:162)
        at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:285)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:606)
        at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
        at org.datanucleus.NucleusContextHelper.createStoreManagerForProperties(NucleusContextHelper.java:133)
        at org.datanucleus.PersistenceNucleusContextImpl.initialise(PersistenceNucleusContextImpl.java:422)
        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:817)
        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:334)
        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:213)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
        at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
        at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:650)
        at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:693)
        at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:483)
        at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:420)
        at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:375)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:77)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:137)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:59)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf(HiveMetaStore.java:718)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:696)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:690)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:767)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:538)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:147)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:108)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:80)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:93)
        at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:8667)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:169)
        at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:94)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:84)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:95)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:148)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:119)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:4299)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:4367)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:4347)
        at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:4603)
        at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:291)
        at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:274)
        at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:435)
        at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:375)
        at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:355)
        at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:331)
        at org.apache.hadoop.hive.ql.metadata.HiveMaterializedViewsRegistry.init(HiveMaterializedViewsRegistry.java:133)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:755)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:606)
        at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:330)
        at org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:203)
        ... 75 more
Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "HikariCP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
        at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:232)
        at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:117)
        at org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:82)
        ... 82 more

Cause:
The JDBC driver jar is missing from Hive's classpath.

Fix:
Download the MariaDB Java driver and place it in Hive's lib directory:
wget https://downloads.mariadb.com/Connectors/java/connector-java-2.7.3/mariadb-java-client-2.7.3.jar

cp mariadb-java-client-2.7.3.jar /usr/local/apache-hive-3.1.2/lib/

Starting again still failed: the driver class name was wrong for MariaDB. Update it in conf/hive-site.xml:
 
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>org.mariadb.jdbc.Driver</value>
    <description>Driver class name for a JDBC metastore</description>
  </property>

(5)Required table missing : "VERSION" in Catalog "" Schema "". DataNucleus requires this table to perform its persistence operations.

Fix:
From the Hive install directory, initialize the metastore schema (the mysql dbType also works against MariaDB):
./bin/schematool -dbType mysql -initSchema
After a successful start you are dropped into Hive's interactive shell, at the following prompt:
hive> 
You can enter SQL statements there; to leave the interactive shell, type:
hive> exit;

 

Tags: hive, hadoop
