Hive test cases
The src/data directory contains the input data.
src/ql/src/test/queries contains the test cases: clientpositive holds the queries that are expected to succeed, and clientnegative holds the queries that are expected to fail (return non-zero).
src/ql/src/test/results contains the expected output for each test case. For example, the expected output of src/ql/src/test/queries/clientpositive/case_sensitivity.q is src/ql/src/test/results/clientpositive/case_sensitivity.q.out.
The test runs case_sensitivity.q and writes the output it produces to build/ql/test/logs/clientpositive/case_sensitivity.q.out, then diffs build/ql/test/logs/clientpositive/case_sensitivity.q.out against src/ql/src/test/results/clientpositive/case_sensitivity.q.out. If the two files are identical the test passes; if they differ, it fails.
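To make the pass/fail rule concrete, here is a minimal sketch of the compare step, assuming a hypothetical helper (QFileCheck is not Hive code; the real comparison logic lives inside QTestUtil):

import java.io.IOException;

// Hypothetical illustration of the pass/fail rule: a q-file test passes
// exactly when `diff produced expected` exits with status 0.
public class QFileCheck {
    public static boolean passes(String name) throws IOException, InterruptedException {
        String produced = "build/ql/test/logs/clientpositive/" + name + ".q.out";
        String expected = "src/ql/src/test/results/clientpositive/" + name + ".q.out";
        Process diff = new ProcessBuilder("diff", "-a", produced, expected)
                .inheritIO()  // print any differences to the console
                .start();
        return diff.waitFor() == 0;  // exit code 0 means the files match
    }

    public static void main(String[] args) throws Exception {
        System.out.println(passes("case_sensitivity") ? "PASS" : "FAIL");
    }
}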
A .q test file is simply a sequence of HiveQL statements, for example:
CREATE TABLE sales (name STRING, id INT)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';
CREATE TABLE things (id INT, name STRING) PARTITIONED BY (ds STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';
load data local inpath 'examples/files/sales.txt' INTO TABLE sales;
load data local inpath 'examples/files/things.txt' INTO TABLE things partition(ds='2011-10-23');
load data local inpath 'examples/files/things2.txt' INTO TABLE things partition(ds='2011-10-24');
SELECT name, id FROM sales ORDER BY name ASC, id ASC;
SELECT id, name FROM things ORDER BY id ASC, name ASC;
SELECT name, id FROM sales LEFT SEMI JOIN things ON (sales.id = things.id) ORDER BY name ASC, id ASC;
Before the tests run, a number of tables are created and data is loaded into them.
For example, the content of case_sensitivity.q is:
tianzhao@tianzhao-VirtualBox:~/hive/hive-1.1.2/src/ql/src/test/queries/clientpositive$ cat case_sensitivity.q
CREATE TABLE DEST1(Key INT, VALUE STRING) STORED AS TEXTFILE;
EXPLAIN
FROM SRC_THRIFT
INSERT OVERWRITE TABLE dest1 SELECT src_Thrift.LINT[1], src_thrift.lintstring[0].MYSTRING where src_thrift.liNT[0] > 0;
FROM SRC_THRIFT
INSERT OVERWRITE TABLE dest1 SELECT src_Thrift.LINT[1], src_thrift.lintstring[0].MYSTRING where src_thrift.liNT[0] > 0;
SELECT DEST1.* FROM Dest1;
// Note that the HiveQL above never creates the SRC_THRIFT table, yet it can use it, because the table was already created before the statements in this file were executed.
The code for that is in org.apache.hadoop.hive.ql.QTestUtil.java, in its init, createSources, and related methods.
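Before any .q file runs, createSources() amounts to executing a fixed set of DDL and LOAD statements against the test warehouse. A hypothetical condensed sketch (runCmd and the abbreviated statement list are mine, not Hive code; the real implementation uses the metastore API plus a runLoadCmd helper, as shown further below):

// Hypothetical sketch of what createSources() sets up for every test run.
public class CreateSourcesSketch {
    // Stand-in for handing a statement to the Hive CLI driver.
    static void runCmd(String hql) {
        System.out.println("would run: " + hql);
    }

    public static void main(String[] args) {
        String files = "/home/tianzhao/apache/hive-trunk/data/files";
        runCmd("CREATE TABLE src(key STRING, value STRING)");
        runCmd("LOAD DATA LOCAL INPATH '" + files + "/kv1.txt' INTO TABLE src");
        runCmd("CREATE TABLE srcpart(key STRING, value STRING) PARTITIONED BY (ds STRING, hr STRING)");
        runCmd("LOAD DATA LOCAL INPATH '" + files + "/kv1.txt' OVERWRITE INTO TABLE srcpart PARTITION (ds='2008-04-08', hr='11')");
        // ...plus src1, srcbucket, srcbucket2, src_sequencefile, src_thrift and
        // src_json, as visible in the ant output below.
    }
}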
This can also be seen in the terminal output of ant test -Dtestcase=TestCliDriver -Dqfile=case_sensitivity.q -Dtest.silent=false:
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/kv1.txt' OVERWRITE INTO TABLE srcpart PARTITION(ds='2008-04-08',hr='11')
[junit] PREHOOK: type: LOAD
[junit] PREHOOK: Output: default@srcpart
[junit] Copying data from file:/home/tianzhao/apache/hive-trunk/data/files/kv1.txt
[junit] Copying file: file:/home/tianzhao/apache/hive-trunk/data/files/kv1.txt
[junit] Loading data to table default.srcpart partition (ds=2008-04-08, hr=11)
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/kv1.txt' OVERWRITE INTO TABLE srcpart PARTITION(ds='2008-04-08',hr='11')
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcpart
[junit] POSTHOOK: Output: default@srcpart@ds=2008-04-08/hr=11
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/kv1.txt' OVERWRITE INTO TABLE srcpart PARTITION(ds='2008-04-08',hr='12')
[junit] PREHOOK: type: LOAD
[junit] PREHOOK: Output: default@srcpart
[junit] Copying data from file:/home/tianzhao/apache/hive-trunk/data/files/kv1.txt
[junit] Copying file: file:/home/tianzhao/apache/hive-trunk/data/files/kv1.txt
[junit] Loading data to table default.srcpart partition (ds=2008-04-08, hr=12)
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/kv1.txt' OVERWRITE INTO TABLE srcpart PARTITION(ds='2008-04-08',hr='12')
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcpart
[junit] POSTHOOK: Output: default@srcpart@ds=2008-04-08/hr=12
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/kv1.txt' OVERWRITE INTO TABLE srcpart PARTITION(ds='2008-04-09',hr='11')
[junit] PREHOOK: type: LOAD
[junit] PREHOOK: Output: default@srcpart
[junit] Copying data from file:/home/tianzhao/apache/hive-trunk/data/files/kv1.txt
[junit] Copying file: file:/home/tianzhao/apache/hive-trunk/data/files/kv1.txt
[junit] Loading data to table default.srcpart partition (ds=2008-04-09, hr=11)
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/kv1.txt' OVERWRITE INTO TABLE srcpart PARTITION(ds='2008-04-09',hr='11')
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcpart
[junit] POSTHOOK: Output: default@srcpart@ds=2008-04-09/hr=11
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/kv1.txt' OVERWRITE INTO TABLE srcpart PARTITION(ds='2008-04-09',hr='12')
[junit] PREHOOK: type: LOAD
[junit] PREHOOK: Output: default@srcpart
[junit] Copying data from file:/home/tianzhao/apache/hive-trunk/data/files/kv1.txt
[junit] Copying file: file:/home/tianzhao/apache/hive-trunk/data/files/kv1.txt
[junit] Loading data to table default.srcpart partition (ds=2008-04-09, hr=12)
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/kv1.txt' OVERWRITE INTO TABLE srcpart PARTITION(ds='2008-04-09',hr='12')
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcpart
[junit] POSTHOOK: Output: default@srcpart@ds=2008-04-09/hr=12
[junit] OK
[junit] PREHOOK: query: CREATE TABLE srcbucket(key int, value string) CLUSTERED BY (key) INTO 2 BUCKETS STORED AS TEXTFILE
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: CREATE TABLE srcbucket(key int, value string) CLUSTERED BY (key) INTO 2 BUCKETS STORED AS TEXTFILE
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/srcbucket0.txt' INTO TABLE srcbucket
[junit] PREHOOK: type: LOAD
[junit] PREHOOK: Output: default@srcbucket
[junit] Copying data from file:/home/tianzhao/apache/hive-trunk/data/files/srcbucket0.txt
[junit] Copying file: file:/home/tianzhao/apache/hive-trunk/data/files/srcbucket0.txt
[junit] Loading data to table default.srcbucket
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/srcbucket0.txt' INTO TABLE srcbucket
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/srcbucket1.txt' INTO TABLE srcbucket
[junit] PREHOOK: type: LOAD
[junit] PREHOOK: Output: default@srcbucket
[junit] Copying data from file:/home/tianzhao/apache/hive-trunk/data/files/srcbucket1.txt
[junit] Copying file: file:/home/tianzhao/apache/hive-trunk/data/files/srcbucket1.txt
[junit] Loading data to table default.srcbucket
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/srcbucket1.txt' INTO TABLE srcbucket
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket
[junit] OK
[junit] PREHOOK: query: CREATE TABLE srcbucket2(key int, value string) CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] PREHOOK: type: CREATETABLE
[junit] POSTHOOK: query: CREATE TABLE srcbucket2(key int, value string) CLUSTERED BY (key) INTO 4 BUCKETS STORED AS TEXTFILE
[junit] POSTHOOK: type: CREATETABLE
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/srcbucket20.txt' INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] PREHOOK: Output: default@srcbucket2
[junit] Copying data from file:/home/tianzhao/apache/hive-trunk/data/files/srcbucket20.txt
[junit] Copying file: file:/home/tianzhao/apache/hive-trunk/data/files/srcbucket20.txt
[junit] Loading data to table default.srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/srcbucket20.txt' INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/srcbucket21.txt' INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] PREHOOK: Output: default@srcbucket2
[junit] Copying data from file:/home/tianzhao/apache/hive-trunk/data/files/srcbucket21.txt
[junit] Copying file: file:/home/tianzhao/apache/hive-trunk/data/files/srcbucket21.txt
[junit] Loading data to table default.srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/srcbucket21.txt' INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/srcbucket22.txt' INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] PREHOOK: Output: default@srcbucket2
[junit] Copying data from file:/home/tianzhao/apache/hive-trunk/data/files/srcbucket22.txt
[junit] Copying file: file:/home/tianzhao/apache/hive-trunk/data/files/srcbucket22.txt
[junit] Loading data to table default.srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/srcbucket22.txt' INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/srcbucket23.txt' INTO TABLE srcbucket2
[junit] PREHOOK: type: LOAD
[junit] PREHOOK: Output: default@srcbucket2
[junit] Copying data from file:/home/tianzhao/apache/hive-trunk/data/files/srcbucket23.txt
[junit] Copying file: file:/home/tianzhao/apache/hive-trunk/data/files/srcbucket23.txt
[junit] Loading data to table default.srcbucket2
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/srcbucket23.txt' INTO TABLE srcbucket2
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@srcbucket2
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/kv1.txt' INTO TABLE src
[junit] PREHOOK: type: LOAD
[junit] PREHOOK: Output: default@src
[junit] Copying data from file:/home/tianzhao/apache/hive-trunk/data/files/kv1.txt
[junit] Copying file: file:/home/tianzhao/apache/hive-trunk/data/files/kv1.txt
[junit] Loading data to table default.src
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/kv1.txt' INTO TABLE src
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/kv3.txt' INTO TABLE src1
[junit] PREHOOK: type: LOAD
[junit] PREHOOK: Output: default@src1
[junit] Copying data from file:/home/tianzhao/apache/hive-trunk/data/files/kv3.txt
[junit] Copying file: file:/home/tianzhao/apache/hive-trunk/data/files/kv3.txt
[junit] Loading data to table default.src1
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/kv3.txt' INTO TABLE src1
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src1
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/kv1.seq' INTO TABLE src_sequencefile
[junit] PREHOOK: type: LOAD
[junit] PREHOOK: Output: default@src_sequencefile
[junit] Copying data from file:/home/tianzhao/apache/hive-trunk/data/files/kv1.seq
[junit] Copying file: file:/home/tianzhao/apache/hive-trunk/data/files/kv1.seq
[junit] Loading data to table default.src_sequencefile
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/kv1.seq' INTO TABLE src_sequencefile
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src_sequencefile
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/complex.seq' INTO TABLE src_thrift
[junit] PREHOOK: type: LOAD
[junit] PREHOOK: Output: default@src_thrift
[junit] Copying data from file:/home/tianzhao/apache/hive-trunk/data/files/complex.seq
[junit] Copying file: file:/home/tianzhao/apache/hive-trunk/data/files/complex.seq
[junit] Loading data to table default.src_thrift
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/complex.seq' INTO TABLE src_thrift
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src_thrift
[junit] OK
[junit] PREHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/json.txt' INTO TABLE src_json
[junit] PREHOOK: type: LOAD
[junit] PREHOOK: Output: default@src_json
[junit] Copying data from file:/home/tianzhao/apache/hive-trunk/data/files/json.txt
[junit] Copying file: file:/home/tianzhao/apache/hive-trunk/data/files/json.txt
[junit] Loading data to table default.src_json
[junit] POSTHOOK: query: LOAD DATA LOCAL INPATH '/home/tianzhao/apache/hive-trunk/data/files/json.txt' INTO TABLE src_json
[junit] POSTHOOK: type: LOAD
[junit] POSTHOOK: Output: default@src_json
[junit] OK
The output above shows the tables that are created and the data that is loaded.
Running the case_sensitivity.q test then prints:
[junit] Running org.apache.hadoop.hive.cli.TestCliDriver
[junit] Begin query: case_sensitivity.q
[junit] Deleted file:/home/tianzhao/apache/hive-trunk/build/ql/test/data/warehouse/dest1
[junit] diff -a -I file: -I pfile: -I hdfs: -I /tmp/ -I invalidscheme: -I lastUpdateTime -I lastAccessTime -I [Oo]wner -I CreateTime -I LastAccessTime -I Location -I transient_lastDdlTime -I last_modified_ -I java.lang.RuntimeException -I at org -I at sun -I at java -I at junit -I Caused by: -I LOCK_QUERYID: -I grantTime -I [.][.][.] [0-9]* more -I USING 'java -cp /home/tianzhao/apache/hive-trunk/build/ql/test/logs/clientpositive/case_sensitivity.q.out /home/tianzhao/apache/hive-trunk/ql/src/test/results/clientpositive/case_sensitivity.q.out
[junit] Done query: case_sensitivity.q
[junit] Cleaning up TestCliDriver
The produced output goes to build/ql/test/logs/clientpositive/case_sensitivity.q.out; whether the UT passes is decided by diffing that file against ql/src/test/results/clientpositive/case_sensitivity.q.out.
Now let's look at how to create the src_thrift table by hand, load data into it, and run the HiveQL from case_sensitivity.q.
The code in org.apache.hadoop.hive.ql.QTestUtil.createSources() that creates the src_thrift table is:
// Excerpt from QTestUtil.createSources(); db is QTestUtil's
// org.apache.hadoop.hive.ql.metadata.Hive instance, and Table is
// org.apache.hadoop.hive.ql.metadata.Table.
Table srcThrift = new Table(db.getCurrentDatabase(), "src_thrift");
srcThrift.setInputFormatClass(SequenceFileInputFormat.class.getName());
srcThrift.setOutputFormatClass(SequenceFileOutputFormat.class.getName());
// Rows are Thrift-serialized Complex records, decoded by ThriftDeserializer.
srcThrift.setSerializationLib(ThriftDeserializer.class.getName());
srcThrift.setSerdeParam(Constants.SERIALIZATION_CLASS, Complex.class.getName());
srcThrift.setSerdeParam(Constants.SERIALIZATION_FORMAT, TBinaryProtocol.class.getName());
db.createTable(srcThrift);
The storage format is SequenceFile and the serde is ThriftDeserializer, with two serde properties, SERIALIZATION_CLASS and SERIALIZATION_FORMAT. Creating a table normally requires its column definitions, yet none are given here: they are already determined by (Constants.SERIALIZATION_CLASS, Complex.class.getName()).
So look at the definition of org.apache.hadoop.hive.serde2.thrift.test.Complex:
public int aint;
public static final int AINT = 1;
public String aString;
public static final int ASTRING = 2;
public List<Integer> lint;
public static final int LINT = 3;
public List<String> lString;
public static final int LSTRING = 4;
public List<IntString> lintString;
public static final int LINTSTRING = 5;
public Map<String,String> mStringString;
public static final int MSTRINGSTRING = 6;
And the definition of IntString:
public int myint;
public static final int MYINT = 1;
public String myString;
public static final int MYSTRING = 2;
public int underscore_int;
public static final int UNDERSCORE_INT = 3;
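Since the table's schema comes entirely from the Thrift class, one quick way to recover these fields programmatically is plain reflection over Complex. A small illustrative utility (DumpComplexFields is mine, not part of Hive; it assumes the Hive serde test classes are on the classpath):

import java.lang.reflect.Field;
import java.lang.reflect.Modifier;

// Illustrative only: the non-static public fields of Complex are what the
// ThriftDeserializer exposes as the table's columns.
public class DumpComplexFields {
    public static void main(String[] args) throws ClassNotFoundException {
        Class<?> c = Class.forName("org.apache.hadoop.hive.serde2.thrift.test.Complex");
        for (Field f : c.getFields()) {
            if (!Modifier.isStatic(f.getModifiers())) {
                System.out.println(f.getName() + " : " + f.getGenericType());
            }
        }
    }
}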
From these definitions we can recover the columns of src_thrift.
The CREATE TABLE statement is:
hive> create table src_thrift(aint int, aString string, lint array<int>, lString array<string>, lintString array<struct<myint:int, mString:string, underscore_int:int>>, mStringString map<string,string>)
    > row format serde 'org.apache.hadoop.hive.serde2.thrift.ThriftDeserializer' with serdeproperties ("serialization.class"="org.apache.hadoop.hive.serde2.thrift.test.Complex", "serialization.format"="org.apache.thrift.protocol.TBinaryProtocol")
    > stored as sequencefile;
OK
Time taken: 0.462 seconds
Loading the data:
hive> load data local inpath 'src/data/files/complex.seq' into table src_thrift;
Copying data from file:/home/tianzhao/hive/hive-1.1.2/src/data/files/complex.seq
Loading data to table src_thrift
OK
Time taken: 0.286 seconds
Querying the data:
hive> select * from src_thrift limit 2;
OK
1712634731  record_0  [0,0,0]  ["0","0","0"]  [{"myint":0,"mystring":"0","underscore_int":0}]  {"key_0":"value_0"}
465985200   record_1  [1,2,3]  ["10","100","1000"]  [{"myint":1,"mystring":"1","underscore_int":1}]  {"key_1":"value_1"}
Time taken: 0.34 seconds
And the table metadata:
hive> desc src_thrift;
OK
aint           int            from deserializer
astring        string         from deserializer
lint           array<int>     from deserializer
lstring        array<string>  from deserializer
lintstring     array<org.apache.hadoop.hive.serde2.thrift.test.IntString>  from deserializer
mstringstring  map<string,string>  from deserializer
Time taken: 0.085 seconds
Running one of the statements from case_sensitivity.q (output excerpted):
hive> from src_thrift SELECT src_Thrift.LINT[1], src_thrift.lintstring[0].MYSTRING where src_thrift.liNT[0] > 0;
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_201105281127_0001, Tracking URL = http://localhost:50030/jobdetails.jsp?jobid=job_201105281127_0001
Kill Command = /home/tianzhao/hive/hadoop-0.20.2/bin/../bin/hadoop job -Dmapred.job.tracker=localhost:54311 -kill job_201105281127_0001
2011-05-28 12:04:52,869 Stage-1 map = 0%, reduce = 0%
2011-05-28 12:04:55,921 Stage-1 map = 100%, reduce = 0%
2011-05-28 12:04:58,962 Stage-1 map = 100%, reduce = 100%
Ended Job = job_201105281127_0001
OK
2   1
4   8
6   27
8   64
10  125
12  216
14  343
16  512
18  729
Time taken: 12.147 seconds
The src table
In org.apache.hadoop.hive.ql.QTestUtil.createSources():
LinkedList<String> cols = new LinkedList<String>();
cols.add("key");
cols.add("value");
for (String tname : new String[] {"src", "src1"}) {
  db.createTable(tname, cols, null, TextInputFormat.class,
      IgnoreKeyTextOutputFormat.class);
}
Both src and src1 have the same two columns; the equivalent DDL is:
create table src(key string, value string);
Loading the data:
// load the input data into the src table
fpath = new Path(testFiles, "kv1.txt");
newfpath = new Path(tmppath, "kv1.txt");
fs.copyFromLocalFile(false, true, fpath, newfpath);
// db.loadTable(newfpath, "src", false);
runLoadCmd("LOAD DATA INPATH '" + newfpath.toString() + "' INTO TABLE src");
which from the CLI is equivalent to:
load data local inpath 'src/data/files/kv1.txt' into table src;
With all of this in place, we can manually run the statements from the individual UTs under src/ql/src/test/queries.
The files under the hive-0.7.0/src/ql/src/test/queries/clientpositive directory:
(1) add_part_exist.q — creates a partitioned table, adds partitions, and shows the partitions:
CREATE TABLE add_part_test (key STRING, value STRING) PARTITIONED BY (ds STRING);
SHOW PARTITIONS add_part_test;
ALTER TABLE add_part_test ADD PARTITION (ds='2010-01-01');
SHOW PARTITIONS add_part_test;
ALTER TABLE add_part_test ADD IF NOT EXISTS PARTITION (ds='2010-01-01');
SHOW PARTITIONS add_part_test;
ALTER TABLE add_part_test ADD IF NOT EXISTS PARTITION (ds='2010-01-02');
SHOW PARTITIONS add_part_test;
ALTER TABLE add_part_test ADD IF NOT EXISTS PARTITION (ds='2010-01-01') PARTITION (ds='2010-01-02') PARTITION (ds='2010-01-03');
SHOW PARTITIONS add_part_test;
DROP TABLE add_part_test;
SHOW TABLES;
desc extended add_part_test partition(ds='2010-01-03');
Note that in SHOW TABLES and SHOW PARTITIONS, both TABLES and PARTITIONS must carry the plural S.
Some confusion around the value '2010-01-02':
hive> alter table add_part_test add if not exists partition(ds=2010-01-02);
FAILED: Parse Error: line 1:61 mismatched input '-' expecting ) in add partition statement
hive> alter table add_part_test add if not exists partition(ds='2010-01-02');
OK
Time taken: 0.294 seconds
hive> alter table add_part_test add if not exists partition(ds=2011102);
OK
Time taken: 0.178 seconds
hive> show partitions add_part_test;
OK
ds=2010-01-01
ds=2010-01-02
ds=2011102
Time taken: 0.057 seconds
(2) alter1.q — modifies a table's properties, its serde and serde properties, and its column information.
(3) create_default_prop.q — creates tables; create_1.q covers table-creation operations, and the other create_* files are similar.
DESCRIBE table3;
DESCRIBE EXTENDED table3;
DESC table3;
DESC EXTENDED table3;
are all equivalent; the keywords are case-insensitive.
DESC is supported in 0.7 but not in 0.5.
hive.map.aggr and hive.groupby.skewindata (for skewed group-by data); see the sketch after the link below.
https://issues.apache.org/jira/browse/HIVE-223
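Both settings can be toggled per session (set hive.map.aggr=true; in the CLI). A minimal Java sketch, using plain string keys on HiveConf so no version-specific constant names are assumed:

import org.apache.hadoop.hive.conf.HiveConf;

// Minimal sketch: enable map-side partial aggregation and the skew-tolerant
// two-job GROUP BY plan (the first job spreads keys over random reducers for
// partial aggregation, the second does the final aggregation).
public class GroupBySkewSettings {
    public static void main(String[] args) {
        HiveConf conf = new HiveConf();
        conf.set("hive.map.aggr", "true");
        conf.set("hive.groupby.skewindata", "true");
        System.out.println("hive.map.aggr = " + conf.get("hive.map.aggr"));
        System.out.println("hive.groupby.skewindata = " + conf.get("hive.groupby.skewindata"));
    }
}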