<description>HDFS root scratch dir for Hive jobs which gets created with write all (733) permission. For each connecting user, an HDFS scratch dir: ${hive.exec.scratchdir}/<username> is created, with ${hive.scratch.dir.permission}.</description>
</property>
<property>
<name>hive.querylog.location</name>
<value>/home/hdp/hive211/iotmp</value>
<description>Location of Hive run time structured log file</description>
</property>
<property>
<name>hive.exec.local.scratchdir</name>
<value>/home/hdp/hive211/iotmp</value>
<description>Local scratch space for Hive jobs</description>
</property>
<property>
<name>hive.downloaded.resources.dir</name>
<value>/home/hdp/hive211/iotmp</value>
<description>Temporary local directory for added resources in the remote file system.</description>
</property>
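All three properties above point Hive at the same local directory under the install tree. A minimal sketch of preparing it before starting Hive, using the path from the values above (adjust to your own install location):
# Create the local scratch/log/resource directory referenced by the properties above
mkdir -p /home/hdp/hive211/iotmp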
[hadoop@hadoop ~]$ hive --service help
Usage ./hive <parameters> --service serviceName <service parameters>
Service List: beeline cli help hiveserver2 hiveserver hwi jar lineage metastore metatool orcfiledump rcfilecat schemaTool version
Parameters parsed:
  --auxpath : Auxillary jars
  --config : Hive configuration directory
  --service : Starts specific service/component. cli is default
Parameters used:
  HADOOP_HOME or HADOOP_PREFIX : Hadoop install directory
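The --service flag picks which component to run (cli is the default). As a hedged example, running the metastore as a standalone background service could look like this (the nohup usage and log path are illustrative, not from the article):
nohup hive --service metastore > /home/hdp/hive211/metastore.log 2>&1 &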
Logging initialized using configuration in jar:file:/home/hdp/hive211/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. tez, spark) or using Hive 1.X releases.
hive>
Exception in thread "main" java.lang.RuntimeException: Hive metastore database is not initialized.
Please use schematool (e.g. ./schematool -initSchema -dbType ...) to create the schema. If needed,
don't forget to include the option to auto-create the underlying database in your JDBC connection string (e.g. ?createDatabaseIfNotExist=true for mysql)
Solution:
# Initialize the metastore database.
bin/schematool -initSchema -dbType mysql
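If schematool fails because the database does not yet exist in MySQL, the error above also suggests letting the JDBC driver create it. A sketch of the corresponding hive-site.xml property; host, port and database name here are illustrative, not from the article:
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
<description>JDBC connect string for the MySQL metastore; host/port/database are illustrative.</description>
</property>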
Start Hive:
bin/hive
After startup, an extra hive database appears in MySQL.
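To confirm, the new metastore database can be listed from MySQL (credentials here are illustrative):
mysql -uroot -p -e "show databases;"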
Testing
Create a database:
create database db_hive_test;
Create a test table:
use db_hive_test;
create table student(id int,name string) row format delimited fields terminated by '\t';
Load data into the table
Create a new student.txt file and write the data into it (id and name separated by the Tab key):
vi student.txt
1001 zhangsan
1002 lisi
1003 wangwu
1004 zhaoli
load data local inpath '/home/hadoop/student.txt' into table db_hive_test.student;
Query the table:
select * from student;
View detailed information about the table:
desc formatted student;
View the created table through MySQL
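A sketch of one way to do this: the metastore database (the hive database created by the schematool step above) records Hive databases and tables in its DBS and TBLS tables, so they can be listed from MySQL (credentials are illustrative):
mysql -uroot -p hive -e "select DB_ID, NAME from DBS; select TBL_ID, DB_ID, TBL_NAME from TBLS;"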
View Hive's functions:
show functions;
View detailed information about a function:
desc function sum;
desc function extended sum;
[hdp@hdp265m ~]$ ant -version
Apache Ant(TM) version 1.9.7 compiled on April 9 2016
Start hwi (the Hive Web Interface):
hive --service hwi
If you get the following error:
The following error occurred while executing this line:
jar:file:/home/linux/application/hive2.1.0/lib/ant-1.9.1.jar!/org/apache/tools/ant/antlib.xml:37: Could not create task or type of type: componentdef.
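This usually points to a mismatch between the ant jar bundled in Hive's lib directory and the Ant 1.9.7 installed above. One common workaround, sketched here under the assumption that HIVE_HOME and ANT_HOME point at those installs, is to swap in the jar from the installed Ant:
# Replace the bundled Ant jar with the one matching the installed Ant (paths are illustrative)
mv $HIVE_HOME/lib/ant-1.9.1.jar $HIVE_HOME/lib/ant-1.9.1.jar.bak
cp $ANT_HOME/lib/ant.jar $HIVE_HOME/lib/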