User sdc does not have privileges for DESCDATABASE
Hello,
We are trying out StreamSets Data Collector in our Hadoop cluster and are facing some difficulty with access.
Details on the cluster:
- CDH 5.15.x Kerberized/LDAP
- Sentry enabled
- JKS/JTS set up (key & trust stores)
- SSL Enabled
- Encryption at rest enabled
We added the StreamSets parcel through Cloudera Manager and have it up and running. We are able to read from (standalone) Hadoop FS and write to HDFS. So far we have done the following:
- sdc has been added to hive proxy user list
- hadoop.kms.proxyuser.sdc.users and hadoop.kms.proxyuser.sdc.hosts have been set in the KMS config
- sdc has been added to the Sentry admin groups and the allowed connecting users list
- sdc has been allowed to decrypt data in encryption zones
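For reference, the KMS proxy-user entries we mean are of this shape in kms-site.xml (we set them through Cloudera Manager; the wildcard values below are illustrative, not necessarily what we used):

```xml
<!-- Allow the sdc service user to impersonate other users via KMS -->
<property>
  <name>hadoop.kms.proxyuser.sdc.users</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.kms.proxyuser.sdc.hosts</name>
  <value>*</value>
</property>
```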
Now, sdc is not in any LDAP group, so we are not able to add sdc to a Sentry role. Should sdc be an LDAP user? sdc is a service similar to, say, Spark or Impala in our cluster, and we do not have Spark entries in Sentry, yet Spark works fine with Hive tables. What are we missing to let sdc access our Hive data?
Despite all of the above, we are not able to read from or write to any Hive tables yet. Here is the error:
com.streamsets.pipeline.api.base.OnRecordErrorException: HIVE_23 - TBL Properties 'com.streamsets.pipeline.stage.lib.hive.exceptions.HiveStageCheckedException: HIVE_20 - Error executing SQL: DESCRIBE DATABASE `db_sdc`, Reason:Error while compiling statement: FAILED: SemanticException No valid privileges
User sdc does not have privileges for DESCDATABASE
The required privileges: Server=server1->Db=db_sdc->action=select;Server=server1->Db=db_sdc->action=insert;' Mismatch: Actual: {} , Expected: {}
at com.streamsets.pipeline.stage.processor.hive.HiveMetadataProcessor.process(HiveMetadataProcessor.java:589)
at com.streamsets.pipeline.api.base.RecordProcessor.process(RecordProcessor.java:52)
at com.streamsets.pipeline.api.base.configurablestage.DProcessor.process(DProcessor.java:35)
at com.streamsets.datacollector.runner.StageRuntime.lambda$execute$2(StageRuntime.java:286)
at com.streamsets.datacollector.runner.StageRuntime.execute(StageRuntime.java:235)
at com.streamsets.datacollector.runner.StageRuntime.execute(StageRuntime.java:298)
at com.streamsets.datacollector.runner.StagePipe.process(StagePipe.java:219)
at com.streamsets.datacollector.runner.preview.PreviewPipelineRunner.lambda$runSourceLessBatch$0(PreviewPipelineRunner.java:348)
at com.streamsets.datacollector.runner.PipeRunner.acceptConsumer(PipeRunner.java:221)
at com.streamsets.datacollector.runner.PipeRunner.executeBatch(PipeRunner.java:142)
at com.streamsets.datacollector.runner.preview.PreviewPipelineRunner.runSourceLessBatch(PreviewPipelineRunner.java:344)
at com.streamsets.datacollector.runner.preview.PreviewPipelineRunner.processBatch(PreviewPipelineRunner.java:269)
at com.streamsets.datacollector.runner.StageRuntime$3.run(StageRuntime.java:370)
at java.security.AccessController.doPrivileged(Native Method)
at com.streamsets.datacollector.runner.StageRuntime.processBatch(StageRuntime.java:366)
at com.streamsets.datacollector.runner.StageContext.processBatch(StageContext.java:270)
at com.streamsets.pipeline.lib.dirspooler.SpoolDirRunnable.produce(SpoolDirRunnable.java:303)
at com.streamsets.pipeline.lib.dirspooler.SpoolDirRunnable.run(SpoolDirRunnable.java:142)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at com.streamsets.pipeline.lib ...
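For completeness, based on the privileges listed in the error, this is the kind of Sentry role and grant we believe would be needed, run in beeline as a Sentry admin. The role name is illustrative, and the final grant assumes sdc can be mapped to a group of the same name, which is exactly the part we are unsure about:

```sql
-- Illustrative sketch: role name "sdc_role" and group "sdc" are our assumptions.
CREATE ROLE sdc_role;

-- The error asks for select and insert on db_sdc:
GRANT SELECT ON DATABASE db_sdc TO ROLE sdc_role;
GRANT INSERT ON DATABASE db_sdc TO ROLE sdc_role;

-- Sentry grants roles to groups, not users, so sdc would need a group mapping:
GRANT ROLE sdc_role TO GROUP sdc;
```

Is this the right direction, or is there a way to let a local service user like sdc through without an LDAP group?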