[Experience Sharing] Using Sqoop 1.4.6 With Hadoop 2.7.4 - candon123

The command below imports the rhnpackagefile table from PostgreSQL straight into a Hive table, splitting the work across four parallel map tasks:

[hadoop@hdp01 ~]$ sqoop import \
    --connect jdbc:postgresql://192.168.120.93:5432/rhndb \
    --table rhnpackagefile \
    --username rhnuser -P \
    --fields-terminated-by ',' \
    --hive-import \
    --hive-database hivedb \
    --columns package_id,capability_id,device,inode,file_mode,username,groupname,rdev,file_size,mtime,checksum_id,linkto,flags,verifyflags,lang,created,modified \
    --split-by modified \
    -m 4

  2017-12-28 11:24:46,666 [myid:] - INFO  [main:Sqoop@92] - Running Sqoop version: 1.4.6
  Enter password:
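
A side note on the prompt above: -P reads the password interactively, which is fine at a terminal but blocks unattended runs. A minimal non-interactive sketch using Sqoop's --password-file option; the HDFS path below is a hypothetical example:

    echo -n 'secret' > pg.password                       # password file, no trailing newline
    chmod 400 pg.password                                # readable only by the job user
    hdfs dfs -put pg.password /user/hadoop/.pg.password  # hypothetical HDFS location
    sqoop import --connect jdbc:postgresql://192.168.120.93:5432/rhndb \
        --username rhnuser \
        --password-file /user/hadoop/.pg.password \
        ...                                              # remaining options as in the command above
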
  2017-12-28 11:24:48,891 [myid:] - INFO  [main:SqlManager@98] - Using default fetchSize of 1000
  2017-12-28 11:24:48,894 [myid:] - INFO  [main:CodeGenTool@92] - Beginning code generation
  2017-12-28 11:24:49,091 [myid:] - INFO  [main:SqlManager@757] - Executing SQL statement: SELECT t.* FROM "rhnpackagefile" AS t LIMIT 1
  2017-12-28 11:24:49,127 [myid:] - INFO  [main:CompilationManager@94] - HADOOP_MAPRED_HOME is /u01/hadoop
  Note: /tmp/sqoop-hadoop/compile/ca09f6bb133fa32808220902aedc0437/rhnpackagefile.java uses or overrides a deprecated API.
  Note: Recompile with -Xlint:deprecation for details.
  2017-12-28 11:24:50,481 [myid:] - INFO  [main:CompilationManager@330] - Writing jar file: /tmp/sqoop-hadoop/compile/ca09f6bb133fa32808220902aedc0437/rhnpackagefile.jar
  2017-12-28 11:24:50,493 [myid:] - WARN  [main:PostgresqlManager@119] - It looks like you are importing from postgresql.
  2017-12-28 11:24:50,493 [myid:] - WARN  [main:PostgresqlManager@120] - This transfer can be faster! Use the --direct
  2017-12-28 11:24:50,494 [myid:] - WARN  [main:PostgresqlManager@121] - option to exercise a postgresql-specific fast path.
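
As these warnings suggest, Sqoop has a PostgreSQL-specific fast path that replaces JDBC reads with COPY via the psql client. A hedged sketch of the same import with --direct added; this assumes psql is installed on every worker node, and it is worth verifying against the Sqoop 1.4.6 documentation which of the other options (for example --columns) direct mode honors:

    sqoop import --connect jdbc:postgresql://192.168.120.93:5432/rhndb \
        --table rhnpackagefile --username rhnuser -P \
        --direct \
        --fields-terminated-by ',' \
        --hive-import --hive-database hivedb \
        --split-by modified -m 4
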
  2017-12-28 11:24:50,495 [myid:] - INFO  [main:ImportJobBase@235] - Beginning import of rhnpackagefile
  2017-12-28 11:24:50,496 [myid:] - INFO  [main:Configuration@1019] - mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
  2017-12-28 11:24:50,634 [myid:] - INFO  [main:Configuration@1019] - mapred.jar is deprecated. Instead, use mapreduce.job.jar
  2017-12-28 11:24:51,160 [myid:] - INFO  [main:Configuration@1019] - mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
  2017-12-28 11:24:51,506 [myid:] - INFO  [main:TimelineClientImpl@123] - Timeline service address: http://hdp01:8188/ws/v1/timeline/
  2017-12-28 11:24:51,696 [myid:] - INFO  [main:AHSProxy@42] - Connecting to Application History server at hdp01.thinkjoy.tt/192.168.120.96:10201
  2017-12-28 11:24:53,801 [myid:] - INFO  [main:DBInputFormat@192] - Using read commited transaction isolation
  2017-12-28 11:24:53,805 [myid:] - INFO  [main:Configuration@1019] - mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
  2017-12-28 11:24:53,805 [myid:] - INFO  [main:DataDrivenDBInputFormat@147] - BoundingValsQuery: SELECT MIN("modified"), MAX("modified") FROM "rhnpackagefile"
  2017-12-28 11:25:14,854 [myid:] - WARN  [main:TextSplitter@64] - Generating splits for a textual index column.
  2017-12-28 11:25:14,854 [myid:] - WARN  [main:TextSplitter@65] - If your database sorts in a case-insensitive order, this may result in a partial import or duplicate records.
  2017-12-28 11:25:14,854 [myid:] - WARN  [main:TextSplitter@67] - You are strongly encouraged to choose an integral split column.
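
These three warnings are worth acting on: modified is being handled by TextSplitter, so split boundaries are computed lexicographically, which can silently drop or duplicate rows under a case-insensitive collation, and which likely also explains why the next line reports 6 splits even though the command asked for -m 4. Since package_id is already in the column list, it is a natural integral split key. A sketch assuming package_id is numeric and reasonably uniformly distributed:

    COLS=package_id,capability_id,device,inode,file_mode,username,groupname,rdev,file_size,mtime,checksum_id,linkto,flags,verifyflags,lang,created,modified
    sqoop import --connect jdbc:postgresql://192.168.120.93:5432/rhndb \
        --table rhnpackagefile --username rhnuser -P \
        --fields-terminated-by ',' \
        --hive-import --hive-database hivedb \
        --columns "$COLS" \
        --split-by package_id -m 4
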
  2017-12-28 11:25:14,903 [myid:] - INFO  [main:JobSubmitter@396] - number of splits:6
  2017-12-28 11:25:14,997 [myid:] - INFO  [main:JobSubmitter@479] - Submitting tokens for job: job_1514358672274_0009
  2017-12-28 11:25:15,453 [myid:] - INFO  [main:YarnClientImpl@236] - Submitted application application_1514358672274_0009
  2017-12-28 11:25:15,485 [myid:] - INFO  [main:Job@1289] - The url to track the job: http://hdp01:8088/proxy/application_1514358672274_0009/
  2017-12-28 11:25:15,486 [myid:] - INFO  [main:Job@1334] - Running job: job_1514358672274_0009
  2017-12-28 11:25:24,763 [myid:] - INFO  [main:Job@1355] - Job job_1514358672274_0009 running in uber mode : false
  2017-12-28 11:25:24,764 [myid:] - INFO  [main:Job@1362] -  map 0% reduce 0%
  2017-12-28 11:26:00,465 [myid:] - INFO  [main:Job@1362] -  map 17% reduce 0%
  2017-12-28 11:26:01,625 [myid:] - INFO  [main:Job@1362] -  map 50% reduce 0%
  2017-12-28 11:26:03,643 [myid:] - INFO  [main:Job@1362] -  map 83% reduce 0%
  2017-12-28 11:34:22,028 [myid:] - INFO  [main:Job@1362] -  map 100% reduce 0%
  2017-12-28 11:34:22,035 [myid:] - INFO  [main:Job@1373] - Job job_1514358672274_0009 completed successfully
  2017-12-28 11:34:22,162 [myid:] - INFO  [main:Job@1380] - Counters: 31
  File System Counters
      FILE: Number of bytes read=0
      FILE: Number of bytes written=860052
      FILE: Number of read operations=0
      FILE: Number of large read operations=0
      FILE: Number of write operations=0
      HDFS: Number of bytes read=913
      HDFS: Number of bytes written=3985558014
      HDFS: Number of read operations=24
      HDFS: Number of large read operations=0
      HDFS: Number of write operations=12
  Job Counters
      Killed map tasks=1
      Launched map tasks=7
      Other local map tasks=7
      Total time spent by all maps in occupied slots (ms)=1208611
      Total time spent by all reduces in occupied slots (ms)=0
      Total time spent by all map tasks (ms)=1208611
      Total vcore-seconds taken by all map tasks=1208611
      Total megabyte-seconds taken by all map tasks=4331661824
  Map-Reduce Framework
      Map input records=18680041
      Map output records=18680041
      Input split bytes=913
      Spilled Records=0
      Failed Shuffles=0
      Merged Map outputs=0
      GC time elapsed (ms)=4453
      CPU time spent (ms)=180780
      Physical memory (bytes) snapshot=1957969920
      Virtual memory (bytes) snapshot=30116270080
      Total committed heap usage (bytes)=1611661312
  File Input Format Counters
      Bytes Read=0
  File Output Format Counters
      Bytes Written=3985558014
  2017-12-28 11:34:22,170 [myid:] - INFO  [main:ImportJobBase@184] - Transferred 3.7118 GB in 571.0001 seconds (6.6566 MB/sec)
  2017-12-28 11:34:22,174 [myid:] - INFO  [main:ImportJobBase@186] - Retrieved 18680041 records.
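
Two things in the summary are worth a second look. The throughput figure checks out: 3,985,558,014 bytes written to HDFS is 3,985,558,014 / 2^30 ≈ 3.7118 GB, and spread over 571 seconds that is ≈ 6.66 MB/sec, matching the log. And although 6 splits were planned, 7 map tasks were launched with 1 killed; since the job completed successfully, that pattern is consistent with a speculative or retried task attempt rather than a real failure.
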
  2017-12-28 11:34:22,215 [myid:] - INFO  [main:SqlManager@757] - Executing SQL statement: SELECT t.* FROM "rhnpackagefile" AS t LIMIT 1
  2017-12-28 11:34:22,245 [myid:] - INFO  [main:HiveImport@194] - Loading uploaded data into Hive
  2017-12-28 11:34:28,609 [myid:] - INFO  [Thread-98:LoggingAsyncSink$LoggingThread@85] -
  2017-12-28 11:34:28,609 [myid:] - INFO  [Thread-98:LoggingAsyncSink$LoggingThread@85] - Logging initialized using configuration in jar:file:/u01/hive/lib/hive-common-2.3.2.jar!/hive-log4j2.properties Async: true
  2017-12-28 11:34:31,619 [myid:] - INFO  [Thread-98:LoggingAsyncSink$LoggingThread@85] - OK
  2017-12-28 11:34:31,622 [myid:] - INFO  [Thread-98:LoggingAsyncSink$LoggingThread@85] - Time taken: 1.666 seconds
  2017-12-28 11:34:32,026 [myid:] - INFO  [Thread-98:LoggingAsyncSink$LoggingThread@85] - Loading data to table hivedb.rhnpackagefile
  2017-12-28 11:36:14,783 [myid:] - INFO  [Thread-98:LoggingAsyncSink$LoggingThread@85] - OK
  2017-12-28 11:36:14,908 [myid:] - INFO  [Thread-98:LoggingAsyncSink$LoggingThread@85] - Time taken: 103.285 seconds
  2017-12-28 11:36:15,363 [myid:] - INFO  [main:HiveImport@242] - Hive import complete.
  2017-12-28 11:36:15,372 [myid:] - INFO  [main:HiveImport@278] - Export directory is contains the _SUCCESS file only, removing the directory.
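
With the import complete, a quick sanity check is to compare the Hive row count against the 18,680,041 records Sqoop reports retrieving. A minimal sketch using the database and table names from the log:

    hive -e 'SELECT COUNT(*) FROM hivedb.rhnpackagefile;'
    # expect: 18680041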

