From the command line, we can get an Oozie job's details even while the job is still running:
export OOZIE_URL=http://..../oozie
oozie job -info 0177204-172227110941438-oozie-oozi-W

The Oozie workflow is running, and in its last shell action I am trying to capture the job-info details into a file:
job.sh
------------
job_id=${1}
export OOZIE_URL=http://..../oozie
oozie job -info "${job_id}" >> /tmp/job_id.txt

However, the command above does not work. Is there a way to get the job-info details of a running Oozie job and store them in a text file?
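The likely culprit is that the original script passed the literal word `job_id` to the CLI instead of expanding the variable with `${job_id}`. A minimal sketch of the difference (the job id below is made up for illustration; `echo` stands in for the real `oozie` call):

```shell
# Hypothetical job id, for illustration only
job_id="0000123-180706123456789-oozie-oozi-W"

# Wrong: the literal string "job_id" is passed as the argument
wrong=$(echo oozie job -info job_id)

# Right: the variable is expanded before the command runs
right=$(echo oozie job -info "${job_id}")

echo "${wrong}"
echo "${right}"
```

Quoting the expansion (`"${job_id}"`) also protects against accidental word splitting if the id ever contains unexpected characters.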
Posted on 2018-07-06 13:33:18
I assume you are running your command on a cluster from which you can get info on, kill, or start Oozie jobs. If that is the case, your command should work fine. I tried the command below and was able to see the result:
cluster:~$ oozie job -info job-id -oozie http://gateway-url/oozie/ > a.txt
cluster:~$ cat a.txt

The output is:
Job ID : job-id
------------------------------------------------------------------------------------------------------------------------------------
Workflow Name : workflow-name
App Path : hdfs://path/to/workflow.xml
Status : RUNNING
Run : 0
User : user-id
Group : -
Created : 2018-07-05 22:30 GMT
Started : 2018-07-05 22:30 GMT
Last Modified : 2018-07-06 05:17 GMT
Ended : -
CoordAction ID: coordinator-id@410
Actions
------------------------------------------------------------------------------------------------------------------------------------
ID Status Ext ID Ext Status Err Code
------------------------------------------------------------------------------------------------------------------------------------
job-id@:start: OK - OK -
------------------------------------------------------------------------------------------------------------------------------------
job-id@action1 OK - action2 -
------------------------------------------------------------------------------------------------------------------------------------
job-id@action2 OK - OK -
------------------------------------------------------------------------------------------------------------------------------------
job-id@action3 OK - action4 -
------------------------------------------------------------------------------------------------------------------------------------
job-id@action4 OK MR_job_id1 SUCCEEDED -
------------------------------------------------------------------------------------------------------------------------------------
job-id@action5 OK MR_job_id2 SUCCEEDED -
------------------------------------------------------------------------------------------------------------------------------------
job-id@action6 OK - action7 -
------------------------------------------------------------------------------------------------------------------------------------
job-id@action7 OK MR_job_id3 SUCCEEDED -
------------------------------------------------------------------------------------------------------------------------------------
job-id@action8 OK MR_job_id4 SUCCEEDED -
------------------------------------------------------------------------------------------------------------------------------------
job-id@action9 RUNNING MR_job_id5 RUNNING -
------------------------------------------------------------------------------------------------------------------------------------

Posted on 2018-01-30 04:09:06
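Once the info dump is redirected into a file, individual fields can be pulled out with standard text tools. A sketch, using a hand-written sample that mimics the `Field : value` layout of the output above (in practice the file would come from `oozie job -info ... > a.txt`; the layout is assumed to be stable):

```shell
# Build a small sample that mimics the 'oozie job -info' dump shown above
cat > /tmp/oozie_info.txt <<'EOF'
Job ID : job-id
Workflow Name : workflow-name
Status        : RUNNING
EOF

# Extract the value after "Status :", tolerating variable whitespace
status=$(awk -F' *: *' '/^Status/ {print $2}' /tmp/oozie_info.txt)
echo "${status}"
```

This prints `RUNNING` for the sample file; the same one-liner works on a real capture as long as the `Status` line keeps that shape.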
If you write the file to HDFS (e.g. hdfs:///tmp/job_id.txt) and read it back with hadoop fs -cat, you can collect more debugging data. Then check whether the URL Oozie generated is correct (for example, by pasting it into a browser). To reference the id of the currently running workflow from inside the workflow itself, use ${wf:id()}.

https://stackoverflow.com/questions/48220856
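A sketch of how ${wf:id()} could be wired into the workflow so the shell script receives the running workflow's own id as its first argument (the action name and transitions here are made up; only the shell-action schema and the wf:id() EL function come from Oozie):

```xml
<action name="capture-info">
    <shell xmlns="uri:oozie:shell-action:0.3">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <exec>job.sh</exec>
        <!-- wf:id() expands to the id of the currently running workflow -->
        <argument>${wf:id()}</argument>
        <file>job.sh#job.sh</file>
    </shell>
    <ok to="end"/>
    <error to="fail"/>
</action>
```

With this, `${1}` inside job.sh is the workflow id, which matches how the script above reads its argument.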