Problems Encountered During CDH Multi-Tenancy Configuration

2018-06-07  阿甘骑士
Multi-tenancy is a very important part of CDH. From the initial KDC setup, through integrating the KDC with the cluster, to day-to-day use of the services, you can run into all kinds of problems. Below are the issues I hit at the time, in the hope that they help others.
Service startup errors
SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]
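This error usually means a client (or a dependent service) is attempting SIMPLE, i.e. unauthenticated, access against a cluster that only accepts Kerberos. A minimal sketch of the client-side core-site.xml settings involved — these are the standard Hadoop property names, and Cloudera Manager normally deploys them for you, so verify your own cluster's values rather than copying these blindly:

```xml
<!-- core-site.xml: switch the Hadoop client from SIMPLE to Kerberos auth -->
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>
```

Also make sure the client has a valid ticket (run kinit first) before connecting.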
Service usage problems
[root@bi-master ~]# klist
Ticket cache: FILE:/tmp/krb5cc_0
Default principal: deng_yb@WONHIGH.COM

Valid starting     Expires            Service principal
06/07/18 20:40:52  06/08/18 20:40:52  krbtgt/WONHIGH.COM@WONHIGH.COM
        renew until 06/14/18 20:40:52
[root@bi-master ~]# hive
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512M; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: Using incremental CMS is deprecated and will likely be removed in a future release
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512M; support was removed in 8.0

Logging initialized using configuration in jar:file:/opt/cloudera/parcels/CDH-5.11.0-1.cdh5.11.0.p0.34/jars/hive-common-1.1.0-cdh5.11.0.jar!/hive-log4j.properties
WARNING: Hive CLI is deprecated and migration to Beeline is recommended.
hive> show databases;
OK
bi
default
gms
gtp
gtp_data
gtp_dc
gtp_test
gtp_txt
kudu_raw
kudu_test
kudu_vip
Time taken: 3.417 seconds, Fetched: 11 row(s)
Last login: Thu Jun  7 21:48:31 2018 from 10.230.71.245
[root@bi-master ~]# klist
Ticket cache: FILE:/tmp/krb5cc_0
Default principal: deng_yb@WONHIGH.COM

Valid starting     Expires            Service principal
06/07/18 20:40:52  06/08/18 20:40:52  krbtgt/WONHIGH.COM@WONHIGH.COM
        renew until 06/14/18 20:40:52
[root@bi-master ~]# beeline
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512M; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: Using incremental CMS is deprecated and will likely be removed in a future release
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512M; support was removed in 8.0
Beeline version 1.1.0-cdh5.11.0 by Apache Hive
beeline> !connect jdbc:hive2://localhost:10000/;principal=hive/bi-master@WONHIGH.COM
scan complete in 9ms
Connecting to jdbc:hive2://localhost:10000/;principal=hive/bi-master@WONHIGH.COM
Connected to: Apache Hive (version 1.1.0-cdh5.11.0)
Driver: Hive JDBC (version 1.1.0-cdh5.11.0)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://localhost:10000/> show databases;
INFO  : Compiling command(queryId=hive_20180607220303_1319b1e8-5ec3-477b-836e-2a279b566ef4): show databases
INFO  : Semantic Analysis Completed
INFO  : Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:database_name, type:string, comment:from deserializer)], properties:null)
INFO  : Completed compiling command(queryId=hive_20180607220303_1319b1e8-5ec3-477b-836e-2a279b566ef4); Time taken: 1.87 seconds
INFO  : Executing command(queryId=hive_20180607220303_1319b1e8-5ec3-477b-836e-2a279b566ef4): show databases
INFO  : Starting task [Stage-0:DDL] in serial mode
INFO  : Completed executing command(queryId=hive_20180607220303_1319b1e8-5ec3-477b-836e-2a279b566ef4); Time taken: 0.835 seconds
INFO  : OK
+----------------+--+
| database_name  |
+----------------+--+
| bi             |
| default        |
+----------------+--+
2 rows selected (4.704 seconds)
# Now query a table that is not under my own permissions
Time taken: 0.076 seconds, Fetched: 114 row(s)
hive> select * from ods_item;
FAILED: SemanticException Unable to determine if hdfs://bi-master:8020/user/hive/warehouse/gtp.db/ods_item is encrypted: org.apache.hadoop.security.AccessControlException: Permission denied: user=deng_yb, access=READ, inode="/user/hive/warehouse/gtp.db/ods_item":hive:hive:drwxrwx--x
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkAccessAcl(DefaultAuthorizationProvider.java:363)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:256)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:168)
        at org.apache.sentry.hdfs.SentryAuthorizationProvider.checkPermission(SentryAuthorizationProvider.java:178)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3529)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3512)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPathAccess(FSDirectory.java:3483)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPathAccess(FSNamesystem.java:6588)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getEZForPath(FSNamesystem.java:9282)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getEZForPath(NameNodeRpcServer.java:1635)
        at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getEZForPath(AuthorizationProviderProxyClientProtocol.java:928)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getEZForPath(ClientNamenodeProtocolServerSideTranslatorPB.java:1360)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2220)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2216)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2214)
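The AccessControlException above is Sentry's HDFS ACL sync at work (note the SentryAuthorizationProvider frame in the stack trace): user deng_yb has no privilege on the gtp database, so the underlying warehouse directory is unreadable. Assuming Sentry manages authorization on this cluster, the usual fix is to grant a role to the user's group from beeline. A sketch — the role name gtp_read and group name deng_yb here are illustrative, not from the original post:

```sql
-- Run in beeline, connected as a Sentry admin (e.g. the hive principal)
CREATE ROLE gtp_read;
GRANT ROLE gtp_read TO GROUP deng_yb;
GRANT SELECT ON DATABASE gtp TO ROLE gtp_read;
```

Sentry grants to roles, and roles to groups — never directly to users — so the OS/LDAP group membership of deng_yb determines what takes effect.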
MapReduce usage problems
Requested user deng_yb is not whitelisted and has id 501, which is below the minimum allowed 1000

Failing this attempt. Failing the application.
17/09/02 20:05:04 INFO mapreduce.Job: Counters: 0
Job Finished in 6.184 seconds
Diagnostics: Application application_1528344974377_0009 initialization failed (exitCode=255) with output: main : command provided 0
main : run as user is hdfs
main : requested yarn user is hdfs
Requested user hdfs is banned
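Both messages come from YARN's LinuxContainerExecutor: by default it rejects users whose UID is below min.user.id (1000) and refuses the banned system users, which include hdfs. In CDH you change these through the YARN configuration in Cloudera Manager ("Minimum User ID" and the banned/allowed user lists); the underlying container-executor.cfg fields look roughly like this (values illustrative, not from the original post):

```
# container-executor.cfg (managed by Cloudera Manager on CDH)
min.user.id=500                 # lowered so deng_yb (uid 501) can launch containers
banned.users=yarn,mapred,bin    # hdfs is banned by default; prefer submitting as a normal user
allowed.system.users=nobody
```

Rather than un-banning hdfs, the safer fix is to submit jobs as a regular (kinit-ed) user and, if needed, either raise that user's UID to 1000+ or lower min.user.id as above.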
Impala problems
(SASL(-4): no mechanism available: No worthy mechs found)
Fix: the client is missing the Cyrus SASL GSSAPI plugin; install it:
yum install cyrus-sasl-plain cyrus-sasl-devel cyrus-sasl-gssapi
Kafka usage problems

For integrating Kafka with Kerberos, see:
https://www.jianshu.com/p/dd73b318e743

Impala JDBC usage problems

For Impala JDBC with Kerberos authentication, see:
https://www.jianshu.com/p/62aa4f9e0615
