Combined with the saved-job mechanism, the --last-value field of a recurring incremental-import job is updated automatically on every run; add a cron or Oozie schedule on top of that and you get true incremental updates.
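For example, a crontab entry along the following lines would re-run the saved job defined below once an hour. This is only a sketch: the sqoop path is inferred from the shell prompt used later in this post, and the log location is an arbitrary choice.

# Run the saved incremental job 'incretest' at the top of every hour.
# Path inferred from the prompt fulong@FBI006:~/Sqoop/sqoop-1.4.4/bin$ — adjust to your installation.
# Environment variables such as JAVA_HOME/HADOOP_HOME may also need to be set for cron.
0 * * * * /home/fulong/Sqoop/sqoop-1.4.4/bin/sqoop job --exec incretest >> /home/fulong/incretest.log 2>&1

Note that if the Sqoop metastore is not configured to record the database password (sqoop.metastore.client.record.password in sqoop-site.xml), --exec may prompt for it interactively, which an unattended cron run cannot answer.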
Experiment: creating and executing an incremental job
Create the incremental-import job:
fulong@FBI006:~/Sqoop/sqoop-1.4.4/bin$ sqoop job --create incretest -- import --connect jdbc:oracle:thin:@192.168.0.138:1521:orcl --username HIVE --password hivefbi --table FBI_SQOOPTEST --hive-import --hive-table INCRETEST --incremental lastmodified --check-column LASTMODIFIED --last-value '2014/8/27 13:00:00'
14/08/27 17:29:37 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
14/08/27 17:29:37 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
14/08/27 17:29:37 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
14/08/27 17:29:37 WARN tool.BaseSqoopTool: It seems that you've specified at least one of following:
14/08/27 17:29:37 WARN tool.BaseSqoopTool: --hive-home
14/08/27 17:29:37 WARN tool.BaseSqoopTool: --hive-overwrite
14/08/27 17:29:37 WARN tool.BaseSqoopTool: --create-hive-table
14/08/27 17:29:37 WARN tool.BaseSqoopTool: --hive-table
14/08/27 17:29:37 WARN tool.BaseSqoopTool: --hive-partition-key
14/08/27 17:29:37 WARN tool.BaseSqoopTool: --hive-partition-value
14/08/27 17:29:37 WARN tool.BaseSqoopTool: --map-column-hive
14/08/27 17:29:37 WARN tool.BaseSqoopTool: Without specifying parameter --hive-import. Please note that
14/08/27 17:29:37 WARN tool.BaseSqoopTool: those arguments will not be used in this session. Either
14/08/27 17:29:37 WARN tool.BaseSqoopTool: specify --hive-import to apply them correctly or remove them
14/08/27 17:29:37 WARN tool.BaseSqoopTool: from command line to remove this warning.
14/08/27 17:29:37 INFO tool.BaseSqoopTool: Please note that --hive-home, --hive-partition-key,
14/08/27 17:29:37 INFO tool.BaseSqoopTool: hive-partition-value and --map-column-hive options are
14/08/27 17:29:37 INFO tool.BaseSqoopTool: are also valid for HCatalog imports and exports
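At this point the job definition lives in the Sqoop metastore. As a sanity check (not part of the original transcript), the saved job can be listed and its stored parameters inspected; in Sqoop 1.4.x the --show output should include the stored last value:

fulong@FBI006:~/Sqoop/sqoop-1.4.4/bin$ ./sqoop job --list
fulong@FBI006:~/Sqoop/sqoop-1.4.4/bin$ ./sqoop job --show incretest

--show prints the job's tool name and properties, among them incremental.last.value, which right after creation should still be the '2014/8/27 13:00:00' supplied above.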
Execute the job:
fulong@FBI006:~/Sqoop/sqoop-1.4.4/bin$ ./sqoop job --exec incretest
Note the SQL statement shown in the log:
14/08/27 17:36:23 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(ID), MAX(ID) FROM FBI_SQOOPTEST WHERE ( LASTMODIFIED >= TO_DATE(' 2014/8/27 13:00:00',
'YYYY-MM-DD HH24:MI:SS') AND LASTMODIFIED < TO_DATE(' 2014-08-27 17:36:23', 'YYYY-MM-DD HH24:MI:SS') )
Here the lower bound on LASTMODIFIED is the value specified when the job was created, and the upper bound is the current time, 2014-08-27 17:36:23.
Verify in Hive:
hive> select * from incretest;
OK
2 lion 2014-08-27
Time taken: 0.085 seconds, Fetched: 1 row(s)
Then I insert a new row into the Oracle table:
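A statement of roughly the following shape would do. The NAME column and the values are hypothetical (only ID and LASTMODIFIED are confirmed by the commands above), and the sqlplus connect string is inferred from the JDBC URL; the key point is that the check column LASTMODIFIED must be set to the current time for the new row to fall inside the next import window.

fulong@FBI006:~$ sqlplus HIVE/hivefbi@192.168.0.138:1521/orcl
SQL> -- hypothetical column name NAME and hypothetical values, for illustration only
SQL> INSERT INTO FBI_SQOOPTEST (ID, NAME, LASTMODIFIED) VALUES (3, 'tiger', SYSDATE);
SQL> COMMIT;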
Run the job again:
fulong@FBI006:~/Sqoop/sqoop-1.4.4/bin$ ./sqoop job --exec incretest
The SQL statement shown in the log this time:
14/08/27 17:47:19 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(ID), MAX(ID) FROM FBI_SQOOPTEST WHERE ( LASTMODIFIED >= TO_DATE(' 2014-08-27 17:36:23',
'YYYY-MM-DD HH24:MI:SS') AND LASTMODIFIED < TO_DATE(' 2014-08-27 17:47:19', 'YYYY-MM-DD HH24:MI:SS') )
Here the lower bound on LASTMODIFIED is the upper bound of the previous execution of the job. In other words, for incremental-import jobs, Sqoop's saved-job mechanism records the timestamp of each run and automatically feeds it in as the --last-value of the next run. All we have to do is schedule the job periodically with crontab (as sketched at the top of this section); --last-value is then kept up to date by the saved-job mechanism, giving true incremental imports.
The row newly added to the Oracle table was indeed imported into the Hive table.
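To watch the mechanism at work between runs, the stored last value can be checked directly; the grep below is just a convenience and assumes, as above, that --show lists an incremental.last.value property:

fulong@FBI006:~/Sqoop/sqoop-1.4.4/bin$ ./sqoop job --show incretest | grep incremental.last.value

After the 17:36 run this should report 2014-08-27 17:36:23, and after the 17:47 run, 2014-08-27 17:47:19, matching the bounds in the logged queries.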
I then added one more row to the Oracle table and executed the job again; the behavior was the same, and the log shows that the previous run's upper bound automatically became this run's lower bound:
14/08/27 17:59:34 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(ID), MAX(ID) FROM FBI_SQOOPTEST WHERE ( LASTMODIFIED >= TO_DATE('2014-08-27 17:47:19', 'YYYY-MM-DD
HH24:MI:SS') AND LASTMODIFIED < TO_DATE('2014-08-27 17:59:34', 'YYYY-MM-DD HH24:MI:SS') )