
HadoopFS as destination throws error while running pipeline

asked 2019-03-27 16:30:24 -0500

renukakallepalli


I created a pipeline to push data from oracle to hive

JDBC Query consumer -> hive metadata -> hadoop FS & Hive metastore

The pipeline passes validation, but I get the error below while running it:

Pipeline Status: RUNNING_ERROR: com.streamsets.pipeline.api.StageException: HADOOPFS_13 - Error while writing to HDFS: com.streamsets.pipeline.api.StageException: HADOOPFS_14 - Cannot write record: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby.
    at org.apache.hadoop.hdfs.server.namenode.ha.StandbyState.checkOperation(
    at org.apache.hadoop.hdfs.server.namenode.NameNode$NameNodeHAContext.checkOperation(
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkOperation(
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(
    at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$
    at org.apache.hadoop.ipc.RPC$
    at org.apache.hadoop.ipc.Server$Handler$
    at org.apache.hadoop.ipc.Server$Handler$
    at Method)
    at
    at
    at org.apache.hadoop.ipc.Server$

Has anyone faced this issue? Can anyone shed some light on it?



1 Answer


answered 2019-04-04 04:50:36 -0500

rishi

updated 2019-05-24 01:46:17 -0500

metadaddy

Do you have HDFS HA enabled? It looks like the request reached the Standby NameNode instead of the Active one.

Get the value of dfs.nameservices and try connecting to HDFS using the nameservice URI instead of a specific NameNode host, e.g. hdfs://<ClusterName>-ns/<hdfs_path>. That way the HDFS client resolves the Active NameNode automatically.

Also make sure hdfs-site.xml and core-site.xml are available to SDC:

- Store the files, or a symlink to the files, in the Data Collector resources directory.
- In the Hadoop FS destination, specify the location of the files.
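For reference, here is a minimal sketch of the HA-related properties the client-side hdfs-site.xml typically needs for the nameservice URI to resolve. The nameservice name (mycluster-ns), NameNode IDs, and hostnames are placeholders; substitute the values from your cluster's configuration:

```xml
<!-- Sketch of client-side HDFS HA settings; names and hosts below are
     placeholders, not values from the question. -->
<property>
  <name>dfs.nameservices</name>
  <value>mycluster-ns</value>
</property>
<property>
  <!-- Logical IDs of the two NameNodes in the nameservice -->
  <name>dfs.ha.namenodes.mycluster-ns</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster-ns.nn1</name>
  <value>namenode1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster-ns.nn2</name>
  <value>namenode2.example.com:8020</value>
</property>
<property>
  <!-- Lets the client try each NameNode and fail over to the Active one -->
  <name>dfs.client.failover.proxy.provider.mycluster-ns</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
```

With this in place you can use hdfs://mycluster-ns/<hdfs_path> in the destination. You can also check which NameNode is currently Active with hdfs haadmin -getServiceState nn1 (substituting your NameNode ID).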

