JDBC Producer with 'Use Multi-Row Operation' enabled: on a write error the pipeline hangs in RUNNING_ERROR until SDC is restarted

asked 2020-07-17 01:39:34 -0500

gavin

SDC version: 3.16

I have configured a pipeline that reads from the MySQL binary log and writes to PostgreSQL.

Source: MySQL Binary Log

Destination: JDBC Producer

JDBC Producer configuration:

"configuration" : [ { "name" : "schema", "value" : "${record:attribute('Database')}" }, { "name" : "tableNameTemplate", "value" : "${record:attribute('Table')}" }, { "name" : "columnNames", "value" : [ ] }, { "name" : "encloseTableName", "value" : false }, { "name" : "changeLogFormat", "value" : "NONE" }, { "name" : "defaultOperation", "value" : "INSERT" }, { "name" : "unsupportedAction", "value" : "DISCARD" }, { "name" : "useMultiRowInsert", "value" : false }, { "name" : "maxPrepStmtParameters", "value" : -1 }, { "name" : "rollbackOnError", "value" : false }, { "name" : "hikariConfigBean.connectionString", "value" : "jdbc:postgresql://x.x.x.x:5432/wms_gzrz" }, { "name" : "hikariConfigBean.useCredentials", "value" : true }, { "name" : "hikariConfigBean.username", "value" : "" }, { "name" : "hikariConfigBean.password", "value" : "" }, { "name" : "hikariConfigBean.driverProperties", "value" : [ { } ] }, { "name" : "hikariConfigBean.maximumPoolSize", "value" : 50 }, { "name" : "hikariConfigBean.minIdle", "value" : 20 }, { "name" : "hikariConfigBean.connectionTimeout", "value" : "${30 * SECONDS}" }, { "name" : "hikariConfigBean.idleTimeout", "value" : "${10 * MINUTES}" }, { "name" : "hikariConfigBean.maxLifetime", "value" : "${30 * MINUTES}" }, { "name" : "hikariConfigBean.initialQuery", "value" : null }, { "name" : "hikariConfigBean.transactionIsolation", "value" : "DEFAULT" }, { "name" : "customDataSqlStateCodes", "value" : [ ] }, { "name" : "stageOnRecordError", "value" : "TO_ERROR" }, { "name" : "stageRequiredFields", "value" : [ ] }, { "name" : "stageRecordPreconditions", "value" : [ ] }, { "name" : "hikariConfigBean.driverClassName", "value" : null }, { "name" : "hikariConfigBean.connectionTestQuery", "value" : null } ],

When the JDBC Producer's 'Use Multi-Row Operation' option is enabled and a primary-key error is thrown (the duplicate key itself is caused by another issue), the pipeline gets stuck in RUNNING_ERROR status. Clicking [Force Stop] changes nothing; I waited for about an hour.

As I understand it, RUNNING_ERROR is a transient status and the next status should be the error state. Is something wrong here?

How can I recover the pipeline status without restarting the SDC service?
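For reference, this is roughly what I have in mind when asking about stopping it outside the UI, i.e. through the SDC REST API. This is only a sketch: the endpoint paths, port, credentials and header below are assumptions for illustration and may differ by SDC version.

import requests

SDC_URL = "http://localhost:18630"                      # assumed SDC address/port
PIPELINE_ID = "datasync6dba2d3c-c1b8-4e2d-adb3-fdfb4913d4a4"  # id from the log below
AUTH = ("admin", "admin")                               # placeholder credentials
HEADERS = {"X-Requested-By": "sdc"}                     # header SDC expects on POSTs (assumption)

# Check the current pipeline status (expecting RUNNING_ERROR here).
status = requests.get(
    f"{SDC_URL}/rest/v1/pipeline/{PIPELINE_ID}/status",
    auth=AUTH, headers=HEADERS,
).json()
print(status.get("status"))

# Ask SDC to force-stop the pipeline (same action as the [Force Stop] button).
resp = requests.post(
    f"{SDC_URL}/rest/v1/pipeline/{PIPELINE_ID}/forceStop",
    auth=AUTH, headers=HEADERS,
)
print(resp.status_code, resp.text)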

Logs:

The primary-key error that is thrown:

2020-07-16 18:19:11,983 data sync - gzrz-wms-sum-to-1-wh/datasync6dba2d3c-c1b8-4e2d-adb3-fdfb4913d4a4 ERROR Error while processing batch of records together: SQLState: 23505 JdbcMultiRowRecordWriter *admin 0 ProductionPipelineRunnable-datasync6dba2d3c-c1b8-4e2d-adb3-fdfb4913d4a4-data sync - gzrz-wms-sum-to-1-wh Error Code: 0 Message: ERROR: duplicate key value violates unique constraint "shipping_order_detail_pkey" (seg1 172.16.5.97:40001 pid=3336) Detail: Key (wh_id, shipping_order_id, line_id)=(wh4, BJ200716000171, 40) already exists.

2020-07-16 18:19:11,984 data sync - gzrz-wms-sum-to-1-wh/datasync6dba2d3c-c1b8-4e2d-adb3-fdfb4913d4a4 ERROR SQLState: 25P02 JdbcBaseRecordWriter *admin 0 ProductionPipelineRunnable-datasync6dba2d3c-c1b8-4e2d-adb3-fdfb4913d4a4-data sync - gzrz-wms-sum-to-1-wh Error Code: 0 Message: ERROR: current transaction is aborted, commands ignored until end of transaction block Cause: org.postgresql.util.PSQLException: ERROR: duplicate key value violates unique constraint "shipping_order_detail_pkey" (seg1 172.16.5.97:40001 pid=3336) Detail: Key (wh_id, shipping_order_id, line_id)=(wh4, BJ200716000171, 40) already exists.

org.postgresql.util.PSQLException: ERROR: current transaction is aborted, commands ignored until end of transaction block
    at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2433)
    at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2178)
    at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:306)
    at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:441)
    at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:365)
    at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:155)
    at org.postgresql.jdbc.PgPreparedStatement.executeUpdate(PgPreparedStatement.java:132)
    at com.zaxxer.hikari.pool.ProxyPreparedStatement.executeUpdate(ProxyPreparedStatement.java:61)
    at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.executeUpdate(HikariProxyPreparedStatement.java)
    at com.streamsets.pipeline.lib.jdbc.JdbcMultiRowRecordWriter.processBatch(JdbcMultiRowRecordWriter.java:312)
    at com.streamsets.pipeline.lib.jdbc.JdbcMultiRowRecordWriter.processQueue(JdbcMultiRowRecordWriter.java:294)
    at com.streamsets.pipeline.lib.jdbc.JdbcMultiRowRecordWriter.writeBatch(JdbcMultiRowRecordWriter.java:156)
    at com.streamsets.pipeline.lib.jdbc.JdbcUtil.write(JdbcUtil.java:1173)
    at com.streamsets.pipeline.lib.jdbc.JdbcUtil.write(JdbcUtil.java:1095)
    at com.streamsets ...
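For what it's worth, the second error (SQLState 25P02) looks like a direct consequence of the first: once one statement in the multi-row batch fails with the duplicate key (23505), PostgreSQL marks the whole transaction as aborted and rejects every following statement until a rollback. A minimal sketch of that behaviour outside SDC (connection details and column list are placeholders for illustration, not my real setup):

import psycopg2

# Placeholder connection; the real table has more columns than shown here.
conn = psycopg2.connect(
    host="x.x.x.x", port=5432, dbname="wms_gzrz",
    user="<user>", password="<password>",
)
conn.autocommit = False

with conn.cursor() as cur:
    try:
        # Inserting an already existing key -> UniqueViolation (SQLState 23505)
        cur.execute(
            "INSERT INTO shipping_order_detail (wh_id, shipping_order_id, line_id) "
            "VALUES ('wh4', 'BJ200716000171', 40)"
        )
    except psycopg2.errors.UniqueViolation as e:
        print(e.pgcode)  # 23505

    try:
        # Any further statement in the same transaction now fails with
        # SQLState 25P02: "current transaction is aborted, commands ignored
        # until end of transaction block"
        cur.execute(
            "INSERT INTO shipping_order_detail (wh_id, shipping_order_id, line_id) "
            "VALUES ('wh4', 'BJ200716000171', 41)"
        )
    except psycopg2.errors.InFailedSqlTransaction as e:
        print(e.pgcode)  # 25P02

conn.rollback()  # only a rollback clears the aborted transaction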
