Sqoop import into Hive produces inconsistent data
sqoop import "-Dorg.apache.sqoop.splitter.allow_text_splitter=true" \
  -D mapreduce.job.queuename=hqgf \
  --connect jdbc:mysql://xxx.xx.x.xxx:1234/hqmart_nc \
  --username bitest --password 123456 \
  --table rent_contract_baseinfo \
  --hive-import --create-hive-table \
  --hive-table hqgf_ods_nc.rent_contract_baseinfo \
  --split-by contractcode \
  --hive-drop-import-delims \
  --fields-terminated-by '\001' --lines-terminated-by '\n' \
  -m 1 --delete-target-dir
--hive-drop-import-delims --fields-terminated-by '\001' --lines-terminated-by '\n'

These options fix the delimiter conflict that causes the inconsistency: --hive-drop-import-delims strips \n, \r, and \01 from string columns during the import, so an embedded newline in a text field cannot split one source record into several Hive rows. If the characters should be kept rather than dropped, use --hive-delims-replacement <string> instead, which replaces \n, \r, and \01 with a custom string when importing into Hive (it is an alternative to --hive-drop-import-delims, not a companion). Either way, this prevents stray line breaks from corrupting the row count.
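To make the failure mode concrete, here is a minimal sketch (not Sqoop itself) of why an embedded newline breaks a Hive text table, which parses '\001' as the field separator and '\n' as the row terminator. The helper names `clean_field` and `to_hive_line` are hypothetical; `clean_field` mimics what --hive-delims-replacement does to each string value.

```python
# A Hive text table splits rows on '\n' and fields on '\x01' (\001).
# If a MySQL text column contains either byte, one source record turns
# into several malformed Hive rows. clean_field mimics Sqoop's
# --hive-delims-replacement: swap \n, \r, \x01 for a harmless string.

def clean_field(value: str, replacement: str = " ") -> str:
    """Replace Hive delimiter bytes inside one field value."""
    for delim in ("\n", "\r", "\x01"):
        value = value.replace(delim, replacement)
    return value

def to_hive_line(fields) -> str:
    """Serialize one record with Hive's default \x01 field delimiter."""
    return "\x01".join(clean_field(f) for f in fields) + "\n"

# A source record whose second column contains an embedded newline.
record = ["HT2023001", "building A\nfloor 3", "12000.00"]

raw = "\x01".join(record) + "\n"   # unsanitized: what a naive import writes
safe = to_hive_line(record)        # sanitized: what the flag produces

print(raw.count("\n"))   # 2 -> Hive would parse two broken rows
print(safe.count("\n"))  # 1 -> exactly one row with three fields
```

The same reasoning explains why --hive-drop-import-delims (which deletes the bytes instead of replacing them) also restores a one-to-one mapping between source records and Hive rows.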