Hadoop Oozie Learning Notes: org.apache.oozie.service.AuthorizationException (E0902: is not allowed to impersonate)
Until now I had been testing Oozie in pseudo-distributed mode, reading everything from the local file system. Now I am testing in a distributed environment, where files are read from HDFS (so your app must also be uploaded to the corresponding HDFS path). The example I use here is $OOZIE_HOME/examples/apps/map-reduce, with job.properties set as follows:
nameNode=hdfs://localhost:9000
jobTracker=localhost:9001
queueName=default
examplesRoot=examples
oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/apps/map-reduce
outputDir=map-reduce
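Since oozie.wf.application.path points into HDFS, the example app has to be uploaded there before submitting. A minimal sketch of the upload, assuming the examples directory sits under $OOZIE_HOME and that /user/guoyun already exists in HDFS (the user name is mine; substitute yours):

cd $OOZIE_HOME
hadoop fs -put examples /user/guoyun/examples
# verify that workflow.xml landed where oozie.wf.application.path expects it
hadoop fs -ls /user/guoyun/examples/apps/map-reduce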
The app job is then submitted from the console with the following command:
$OOZIE_HOME/bin/oozie job -oozie http://localhost:11000/oozie -config /home/guoyun/Hadoop/oozie-2.3.2-cdh3u2/examples/apps/map-reduce/job.properties -run
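On a successful submission the client prints a job ID, which can then be used to check the job's status (a sketch; the job ID shown here is made up):

$OOZIE_HOME/bin/oozie job -oozie http://localhost:11000/oozie -info 0000001-111113000000000-oozie-guoy-W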
In my case, however, the submission failed with the following exception:
org.apache.oozie.service.AuthorizationException: E0902: Exception occured: [org.apache.hadoop.ipc.RemoteException: User: guoyun is not allowed to impersonate guoyun]
    at org.apache.oozie.service.AuthorizationService.authorizeForApp(AuthorizationService.java:318)
    at org.apache.oozie.servlet.BaseJobServlet.checkAuthorizationForApp(BaseJobServlet.java:171)
    at org.apache.oozie.servlet.BaseJobsServlet.doPost(BaseJobsServlet.java:89)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:637)
    at org.apache.oozie.servlet.JsonRestServlet.service(JsonRestServlet.java:281)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.oozie.servlet.AuthFilter$2.doFilter(AuthFilter.java:123)
    at com.cloudera.alfredo.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:371)
    at org.apache.oozie.servlet.AuthFilter.doFilter(AuthFilter.java:128)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:859)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588)
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
    at java.lang.Thread.run(Thread.java:662)
Caused by: org.apache.oozie.service.HadoopAccessorException: E0902: Exception occured: [org.apache.hadoop.ipc.RemoteException: User: guoyun is not allowed to impersonate guoyun]
    at org.apache.oozie.service.KerberosHadoopAccessorService.createFileSystem(KerberosHadoopAccessorService.java:211)
    at org.apache.oozie.service.AuthorizationService.authorizeForApp(AuthorizationService.java:283)
    ... 22 more
Caused by: org.apache.hadoop.ipc.RemoteException: User: guoyun is not allowed to impersonate guoyun
    at org.apache.hadoop.ipc.Client.call(Client.java:1107)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
    at $Proxy22.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:398)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:384)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:111)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:213)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:180)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1514)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1548)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1530)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:228)
    at org.apache.oozie.service.KerberosHadoopAccessorService$3.run(KerberosHadoopAccessorService.java:203)
    at org.apache.oozie.service.KerberosHadoopAccessorService$3.run(KerberosHadoopAccessorService.java:194)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
    at org.apache.oozie.service.KerberosHadoopAccessorService.createFileSystem(KerberosHadoopAccessorService.java:194)
    ... 23 more
This is a permissions problem, and Google turned up a fairly detailed solution. For now I can only make a rough guess at the principle and will work out the details over time: Oozie needs to read the submitted app from HDFS, and the Oozie server does so on behalf of the submitting user, i.e. it impersonates that user, which Hadoop refuses unless the user is explicitly whitelisted as a proxy user. So the relevant configuration has to be added. My user name (the one used to submit the app job) is guoyun; you can get yours with whoami, and the group it belongs to with groups userName.
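For example (a sketch; the output is from my machine, yours will differ):

$ whoami
guoyun
$ groups guoyun
guoyun : guoyun

With the user name and group in hand, set them in $HADOOP_HOME/conf/core-site.xml. My settings are as follows: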
<!-- for oozie, added by guoyun, 2011-11-13 -->
<property>
  <name>hadoop.proxyuser.guoyun.hosts</name>
  <value>localhost</value>
</property>
<property>
  <name>hadoop.proxyuser.guoyun.groups</name>
  <value>guoyun</value>
</property>
Replace guoyun in the property names hadoop.proxyuser.guoyun.hosts and hadoop.proxyuser.guoyun.groups with your own user name, and put the group you obtained above into the groups value. (Many guides simply use * for both values to allow any host and any group, at the cost of looser security.)
After modifying core-site.xml you need to restart your Hadoop, then submit the job again. OK! The exception is resolved.
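For reference, on this single-node setup the restart is just the stock scripts (a sketch; assumes the standard Hadoop 0.20/CDH3 layout):

$HADOOP_HOME/bin/stop-all.sh
$HADOOP_HOME/bin/start-all.sh

The restart appears to be necessary because the NameNode only reads the hadoop.proxyuser.* properties at startup, so edits to core-site.xml are not picked up by a running cluster.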