Error - Cannot make connection with default hive database [closed]

asked 2018-02-26 04:45:48 -0500 by Shruthi

updated 2018-03-14 18:29:19 -0500 by metadaddy

I am trying to write data to Hive. My origin is a JDBC Query Consumer, the processor is Hive Metadata, and the destinations are Hadoop FS and Hive Metastore. But I get the following error:

HIVE_22 - Cannot make connection with default hive database starting with URL: jdbc:hive2://localhost:10000/default. Reason:HIVE_22 - Cannot make connection with default hive database starting with URL: jdbc:hive2://localhost:10000/default. Reason:null
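Since the error's `Reason:` is `null`, a useful first step is to confirm that something is actually listening on the HiveServer2 address from the JDBC URL before debugging the pipeline itself. A minimal sketch of such a check (this is my own diagnostic snippet, not part of the StreamSets tutorial; the host and port are taken from the URL in the error):

```python
import socket

def port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# HiveServer2 endpoint from the JDBC URL jdbc:hive2://localhost:10000/default
print(port_open("localhost", 10000))
```

If this prints `False`, HiveServer2 is not reachable at that address and the SDC stage error is a symptom, not the cause.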

I followed this document:- https://github.com/streamsets/tutoria...

Given below is my log file:

2018-03-01 10:21:17,960 testhive/testhive607a84b8-90a8-4d52-bbec-453f91416049   ERROR   Error Connecting to Hive Database with URL jdbc:hive2://localhost:10000/emp_new HiveConfigBean  *admin  0   preview-pool-1-thread-1
com.streamsets.pipeline.api.StageException: HIVE_22 - Cannot make connection with default hive database starting with URL: jdbc:hive2://localhost:10000/emp_new. Reason:null

    at com.streamsets.pipeline.stage.lib.hive.HiveMetastoreUtil.getHiveConnection(HiveMetastoreUtil.java:854)
    at com.streamsets.pipeline.stage.lib.hive.HiveConfigBean.getHiveConnection(HiveConfigBean.java:139)
    at com.streamsets.pipeline.stage.lib.hive.HiveConfigBean.init(HiveConfigBean.java:258)
    at com.streamsets.pipeline.stage.destination.hive.HMSTargetConfigBean.init(HMSTargetConfigBean.java:170)
    at com.streamsets.pipeline.stage.destination.hive.HiveMetastoreTarget.init(HiveMetastoreTarget.java:81)
    at com.streamsets.pipeline.api.base.BaseStage.init(BaseStage.java:48)
    at com.streamsets.pipeline.configurablestage.DStage.init(DStage.java:36)
    at com.streamsets.datacollector.runner.StageRuntime.lambda$init$0(StageRuntime.java:176)
    at com.streamsets.datacollector.util.LambdaUtil.withClassLoaderInternal(LambdaUtil.java:148)
    at com.streamsets.datacollector.util.LambdaUtil.withClassLoader(LambdaUtil.java:44)
    at com.streamsets.datacollector.runner.StageRuntime.init(StageRuntime.java:174)
    at com.streamsets.datacollector.runner.StagePipe.init(StagePipe.java:100)
    at com.streamsets.datacollector.runner.StagePipe.init(StagePipe.java:48)
    at com.streamsets.datacollector.runner.Pipeline.initPipe(Pipeline.java:386)
    at com.streamsets.datacollector.runner.Pipeline.lambda$init$0(Pipeline.java:376)
    at com.streamsets.datacollector.runner.PipeRunner.forEach(PipeRunner.java:162)
    at com.streamsets.datacollector.runner.Pipeline.init(Pipeline.java:374)
    at com.streamsets.datacollector.runner.preview.PreviewPipeline.run(PreviewPipeline.java:49)
    at com.streamsets.datacollector.execution.preview.sync.SyncPreviewer.start(SyncPreviewer.java:207)
    at com.streamsets.datacollector.execution.preview.async.AsyncPreviewer.lambda$start$0(AsyncPreviewer.java:94)
    at com.streamsets.pipeline.lib.executor.SafeScheduledExecutorService$SafeCallable.lambda$call$0(SafeScheduledExecutorService.java:227)
    at com.streamsets.datacollector.security.GroupsInScope.execute(GroupsInScope.java:33)
    at com.streamsets.pipeline.lib.executor.SafeScheduledExecutorService$SafeCallable.call(SafeScheduledExecutorService.java:223)
    at com.streamsets.pipeline.lib.executor.SafeScheduledExecutorService$SafeCallable.lambda$call$0(SafeScheduledExecutorService.java:227)
    at com.streamsets.datacollector.security.GroupsInScope.execute(GroupsInScope.java:33)
    at com.streamsets.pipeline.lib.executor.SafeScheduledExecutorService$SafeCallable.call(SafeScheduledExecutorService.java:223)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at com.streamsets.datacollector.metrics.MetricSafeScheduledExecutorService$MetricsTask.run(MetricSafeScheduledExecutorService.java:100)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

Can anyone please help?


Closed as a duplicate question by Shruthi (close date 2018-07-03 05:14:44)

Comments

Is there any more detail in the sdc.log file?

metadaddy ( 2018-02-28 15:08:20 -0500 )

I have added the sdc.log file details to my question, @metadaddy

Shruthi ( 2018-02-28 23:03:38 -0500 )

Are SDC and Hive on the same machine? Is either/both of them in a VM?

metadaddy ( 2018-03-14 18:30:04 -0500 )

SDC and Hive are on the same machine. Neither of them is in a VM. I am trying this locally.

Shruthi ( 2018-03-14 23:46:54 -0500 )