
Failed to initialize connection pool - JDBC Redshift

asked 2019-02-13 06:17:34 -0600 by larsjaja

updated 2019-02-13 11:51:59 -0600 by metadaddy

This happens while trying to run a query against Redshift. The URL and credentials should all be valid (they work in most other JDBC clients):

jdbc:redshift://<url to redshift cluster>:5439/<database>

I have tried pretty much every Redshift JDBC driver version; the latest was RedshiftJDBC42-no-awssdk-1.2.10.1009.jar, downloaded from AWS.

Please advise.

com.zaxxer.hikari.pool.PoolInitializationException: Exception during pool initialization: [Amazon](500150) Error setting/closing connection: SocketTimeoutException.
    at com.zaxxer.hikari.pool.HikariPool.initializeConnections(HikariPool.java:581)
    at com.zaxxer.hikari.pool.HikariPool.<init>(HikariPool.java:152)
    at com.zaxxer.hikari.HikariDataSource.<init>(HikariDataSource.java:73)
    at com.streamsets.pipeline.lib.jdbc.JdbcUtil.createDataSourceForRead(JdbcUtil.java:801)
    at com.streamsets.pipeline.stage.executor.jdbc.JdbcQueryExecutorConfig.init(JdbcQueryExecutorConfig.java:71)
    at com.streamsets.pipeline.stage.executor.jdbc.JdbcQueryExecutor.init(JdbcQueryExecutor.java:51)
    at com.streamsets.pipeline.api.base.BaseStage.init(BaseStage.java:48)
    at com.streamsets.pipeline.api.base.configurablestage.DStage.init(DStage.java:36)
    at com.streamsets.datacollector.runner.StageRuntime.lambda$init$0(StageRuntime.java:211)
    at com.streamsets.datacollector.util.LambdaUtil.withClassLoaderInternal(LambdaUtil.java:148)
    at com.streamsets.datacollector.util.LambdaUtil.withClassLoader(LambdaUtil.java:44)
    at com.streamsets.datacollector.runner.StageRuntime.init(StageRuntime.java:209)
    at com.streamsets.datacollector.runner.StagePipe.init(StagePipe.java:123)
    at com.streamsets.datacollector.runner.StagePipe.init(StagePipe.java:47)
    at com.streamsets.datacollector.runner.Pipeline.initPipe(Pipeline.java:408)
    at com.streamsets.datacollector.runner.Pipeline.lambda$init$0(Pipeline.java:397)
    at com.streamsets.datacollector.runner.PipeRunner.forEach(PipeRunner.java:170)
    at com.streamsets.datacollector.runner.Pipeline.init(Pipeline.java:394)
    at com.streamsets.datacollector.execution.runner.common.ProductionPipeline.run(ProductionPipeline.java:95)
    at com.streamsets.datacollector.execution.runner.common.ProductionPipelineRunnable.run(ProductionPipelineRunnable.java:75)
    at com.streamsets.datacollector.execution.runner.standalone.StandaloneRunner.start(StandaloneRunner.java:724)
    at com.streamsets.datacollector.execution.AbstractRunner.lambda$scheduleForRetries$0(AbstractRunner.java:349)
    at com.streamsets.pipeline.lib.executor.SafeScheduledExecutorService$SafeCallable.lambda$call$0(SafeScheduledExecutorService.java:226)
    at com.streamsets.datacollector.security.GroupsInScope.execute(GroupsInScope.java:33)
    at com.streamsets.pipeline.lib.executor.SafeScheduledExecutorService$SafeCallable.call(SafeScheduledExecutorService.java:222)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at com.streamsets.datacollector.metrics.MetricSafeScheduledExecutorService$MetricsTask.run(MetricSafeScheduledExecutorService.java:100)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

Comments


SocketTimeoutException implies a connectivity issue from the machine where SDC is running to the DB instance. Is there a way to run a command line query against Redshift, and can you try doing so from the same machine where SDC runs?

jeff ( 2019-02-13 11:48:36 -0600 )
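To act on that suggestion, the quickest check is a raw TCP connect from the Data Collector machine to the cluster endpoint on port 5439 with a short timeout, which takes JDBC out of the picture entirely. A minimal sketch in Java (the endpoint is a hypothetical placeholder, and `isReachable` is just a helper name, not part of any library):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    // Returns true if a plain TCP connection to host:port succeeds within timeoutMs.
    // A SocketTimeoutException (a subclass of IOException) here usually means the
    // packets are being silently dropped by a firewall or security group,
    // as opposed to an immediate "connection refused".
    static boolean isReachable(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Substitute your real cluster endpoint, e.g.
        // "example-cluster.abc123.us-east-1.redshift.amazonaws.com"
        String host = args.length > 0 ? args[0] : "127.0.0.1";
        System.out.println(host + ":5439 reachable? " + isReachable(host, 5439, 3000));
    }
}
```

If this returns false from the SDC host but true from a machine where other JDBC clients work, the problem is network-level (outbound firewall rules or security groups), not the driver version.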

Also, this kind of issue can arise with AWS resources if the security settings on the AWS side are not permitting the type of connection you're attempting to establish.

jeff ( 2019-02-13 11:49:14 -0600 )

1 Answer


answered 2019-02-14 05:49:11 -0600 by larsjaja

updated 2019-02-14 09:07:41 -0600 by metadaddy

Thanks, @jeff - you were right! It turned out our network and security people are very cautious, so outgoing traffic was blocked... Now I am in, and dumping data into Redshift.
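For anyone hitting the same (500150) SocketTimeoutException, a standalone connection attempt run on the same machine as Data Collector separates driver problems from network problems before touching pipeline configuration. A minimal sketch, assuming the Redshift JDBC jar is on the classpath; the endpoint and credentials below are hypothetical placeholders, not values from this thread:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.util.Properties;

public class RedshiftSmokeTest {
    public static void main(String[] args) {
        // Hypothetical endpoint and credentials -- replace with your own.
        String url = "jdbc:redshift://example-cluster.abc123.us-east-1.redshift.amazonaws.com:5439/dev";
        Properties props = new Properties();
        props.setProperty("user", "awsuser");
        props.setProperty("password", "secret");

        // Fail fast instead of hanging for minutes on a blocked route.
        DriverManager.setLoginTimeout(10); // seconds

        try (Connection conn = DriverManager.getConnection(url, props)) {
            System.out.println("Connected: " + conn.getMetaData().getDatabaseProductName());
        } catch (SQLException e) {
            // A SocketTimeoutException in the cause chain points at the network
            // or security-group rules, not at credentials or the driver version.
            System.out.println("Connection failed: " + e.getMessage());
        }
    }
}
```

If this times out on the SDC host while the same code succeeds from another network, outbound access to port 5439 is being blocked, which matches the resolution above.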
