
How to avoid pipeline failure and move to next batch

asked 2019-04-25 21:33:16 -0500 by Asim Bagwan

Hello,

A couple of records from a batch are failing to write to the destination (PostgreSQL) due to a unique key violation. I have configured error handling to redirect failed records to a different pipeline, but even with this setting my pipeline is failing.

I would like the pipeline to continue with the next batch, since I am writing the failed records to a different location for later re-processing. Could you please help me achieve this?

Regards, Asim
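
For context, the behavior being asked for looks roughly like the plain-JDBC sketch below: catch the unique key violation (PostgreSQL reports it with SQLState 23505) for the offending record only, park that record somewhere for later re-processing, and keep going with the rest of the batch. The events and errored_events tables are hypothetical, and this is only an illustration of the pattern, not SDC's implementation.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    public class DivertOnUniqueViolation {
        // PostgreSQL SQLState for unique_violation.
        private static final String UNIQUE_VIOLATION = "23505";

        // Hypothetical tables: events(id INT PRIMARY KEY, payload TEXT)
        // and errored_events(id INT, payload TEXT, error_code TEXT).
        public static void writeBatch(Connection conn, int[] ids) throws SQLException {
            try (PreparedStatement insert = conn.prepareStatement(
                     "INSERT INTO events (id, payload) VALUES (?, ?)");
                 PreparedStatement divert = conn.prepareStatement(
                     "INSERT INTO errored_events (id, payload, error_code) VALUES (?, ?, ?)")) {
                for (int id : ids) {
                    try {
                        insert.setInt(1, id);
                        insert.setString(2, "payload-" + id);
                        insert.executeUpdate();   // with autocommit on, each row commits on its own
                    } catch (SQLException e) {
                        if (!UNIQUE_VIOLATION.equals(e.getSQLState())) {
                            throw e;              // unexpected error: surface it
                        }
                        // Duplicate key: park this record for later re-processing
                        // and move on instead of failing the whole batch.
                        divert.setInt(1, id);
                        divert.setString(2, "payload-" + id);
                        divert.setString(3, e.getSQLState());
                        divert.executeUpdate();
                    }
                }
            }
        }
    }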


1 Answer


answered 2019-04-26 12:11:54 -0500 by Victor P

Hi Asim,

Which version of SDC are you running? Prior to version 3.8.0 there was a bug that caused record-level errors to fail the whole pipeline instead of sending them to the error stream.

Please note that if you use the "Use multi-row insert" option, a single primary key error will still cause the whole insert group to fail, due to the underlying behavior of the JDBC connection. The target database might also not report exactly which record in the group caused the problem.
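
To see why the whole group fails, here is a rough plain-JDBC sketch (not SDC's actual implementation; the demo database, credentials, and the events(id INT PRIMARY KEY, payload TEXT) table are hypothetical). A single multi-row INSERT is rejected as one statement, while per-row statements isolate the failure to the offending record.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class MultiRowVersusSingleRow {
        public static void main(String[] args) throws Exception {
            // Hypothetical database and table: events(id INT PRIMARY KEY, payload TEXT).
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://localhost:5432/demo", "demo", "demo");
                 Statement st = conn.createStatement()) {

                // Multi-row form: the duplicate id 1 makes PostgreSQL reject the
                // entire statement, so none of the three rows is written.
                try {
                    st.executeUpdate(
                        "INSERT INTO events (id, payload) VALUES (1, 'a'), (2, 'b'), (1, 'dup')");
                } catch (SQLException e) {
                    System.out.println("whole group rejected: " + e.getSQLState());
                }

                // Single-row form: only the statement for the duplicate id fails;
                // the other rows are written (autocommit is on by default).
                for (String values : new String[] {"(1, 'a')", "(2, 'b')", "(1, 'dup')"}) {
                    try {
                        st.executeUpdate("INSERT INTO events (id, payload) VALUES " + values);
                    } catch (SQLException e) {
                        System.out.println("only " + values + " rejected: " + e.getSQLState());
                    }
                }
            }
        }
    }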

Another thing to check: on the JDBC Producer's General tab, confirm the stage is configured to send error records to the pipeline's error handling rather than to stop the pipeline.

Thanks, Victor


Comments

Thanks Victor! We are using version 3.9 and I have the same settings as you mentioned. The only difference was that initially I was not redirecting error records; instead I was discarding records on error. Could that be the reason for the failure? Thanks!

Asim Bagwan ( 2019-04-26 20:08:57 -0500 )