On my own virtual machine I use MySQL as the Hive metastore, with the configuration changed as follows:
- javax.jdo.option.ConnectionURL = jdbc:mysql://localhost:3306/metastore (JDBC connect string for a JDBC metastore)
- javax.jdo.option.ConnectionDriverName = com.mysql.jdbc.Driver (driver class name for a JDBC metastore)
- javax.jdo.option.ConnectionUserName = hive (username to use against metastore database)
- javax.jdo.option.ConnectionPassword = hive (password to use against metastore database)
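For reference, these four settings correspond to property entries in hive-site.xml, in the standard Hadoop configuration format, roughly like this (values are the ones listed above):

```xml
<!-- hive-site.xml: point the Hive metastore at a local MySQL database -->
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/metastore</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive</value>
  </property>
</configuration>
```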
The MySQL JDBC driver was also copied into HIVE_HOME/lib.
Next, log into the Hive CLI and run the show databases; command, which fails:
- [ruizhe@localhost ~]$ hive
- Hive history file=/tmp/ruize/hive_job_log_ruize_201204091822_467986476.txt
- hive> show databases;
- FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
- NestedThrowables:
- java.lang.reflect.InvocationTargetException
- FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
- hive>
This problem bothered me for quite a while: the same configuration worked in my laptop environment but failed on the company machine.
A web search finally turned up the following fix:
- delete $HADOOP_HOME/build and every thing should be fine
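Why this helps: when a build/ directory exists under HADOOP_HOME, Hadoop's bin/hadoop launcher puts the freshly built classes on the CLASSPATH ahead of the release jars, and Hive reuses that classpath, so a stale ant build clashes with the shipped jars. A minimal sketch of that logic (paraphrased, not the exact script; the HADOOP_HOME path is the one from this machine):

```shell
# Sketch of the classpath logic in Hadoop 0.20's bin/hadoop script
# (paraphrased from memory; exact lines vary by release).
HADOOP_HOME=/opt/app/hadoop-0.20.2-cdh3u3
CLASSPATH=$HADOOP_HOME/conf

# If a build/ directory exists, its classes are placed on the
# CLASSPATH ahead of the release jars -- this is how a leftover
# ant build leaks into Hive's classpath.
if [ -d "$HADOOP_HOME/build/classes" ]; then
  CLASSPATH=$CLASSPATH:$HADOOP_HOME/build/classes
fi

echo "$CLASSPATH"
```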
Change into the HADOOP_HOME directory:
- [hadoop@localhost build]$ pwd
- /opt/app/hadoop-0.20.2-cdh3u3/build
- [hadoop@localhost build]$ ls
- ant c++ classes contrib examples hadoop-core-0.20.2-cdh3u3.jar hadoop-tools-0.20.2-cdh3u3.jar ivy src test tools webapps
- [hadoop@localhost build]$
Sure enough, HADOOP_HOME/build contains freshly built output (I had rebuilt Hadoop with ant).
Delete the build directory:
- [hadoop@localhost hadoop-0.20.2-cdh3u3]$ rm -rf build
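If you would rather keep the ant output recoverable, moving the directory aside should work just as well, since it is the presence of build/ under HADOOP_HOME that triggers the problem. A sketch in a throwaway directory (paths are illustrative; in practice the mv runs inside HADOOP_HOME):

```shell
# Demonstrate the rename in a scratch directory; build.bak is an
# arbitrary name outside the paths the hadoop script looks for.
scratch=$(mktemp -d)
cd "$scratch"
mkdir build                  # stand-in for the ant build output
mv build build.bak           # build/ is gone, but nothing is lost
ls
```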
Go back into the Hive CLI:
- [ruize@localhost ~]$ hive
- Hive history file=/tmp/ruize/hive_job_log_ruize_201204091826_1452110220.txt
- hive> show databases;
- OK
- default
- Time taken: 1.786 seconds
- hive>
This time it works!