Ask Your Question

How to stop the pipeline after one batch of data processing

asked 2019-12-10 04:26:41 -0600

Shashank Shukla

I am trying to create a pipeline that reads data from a database and writes it to a file. The pipeline runs fine, but after completing one cycle of processing, that is, reading the whole database, it starts reading again and writes to the file a second time. Can someone please suggest how to stop the pipeline after it has finished processing the data once?

There are more than 1 million rows in the database, so the file should have the same number of rows, but because the pipeline restarts after it has processed all the data once, the file ends up with duplicate entries.

I need to find a way to make the pipeline stop after all the data has been written to the file once.


1 Answer


answered 2019-12-10 09:55:13 -0600

iamontheinet

updated 2019-12-10 18:44:27 -0600


Both the JDBC Multitable Consumer and the JDBC Query Consumer origins will generate a no-more-data event when you enable Produce Events on the origin. Once you do that, you can attach and configure a Pipeline Finisher executor to automatically stop the pipeline when it receives the no-more-data event.

Cheers, Dash
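To make the mechanism concrete, here is a toy Python sketch of the event flow Dash describes. This is not StreamSets code and `run_pipeline` is a hypothetical function, purely to illustrate why the no-more-data event plus a finisher prevents a second read cycle:

```python
def run_pipeline(rows, batch_size=1000):
    """One run: read batches until the origin reports no more data."""
    written = []
    offset = 0
    while True:
        batch = rows[offset:offset + batch_size]
        offset += len(batch)
        written.extend(batch)              # destination writes the batch
        if offset >= len(rows):
            # The origin produces the no-more-data event here; the
            # Pipeline Finisher receives it and stops the pipeline,
            # so the origin never starts a second read of the table.
            break
    return written

rows = list(range(5000))
out = run_pipeline(rows)
assert out == rows  # every row written exactly once, no duplicates
```

Without the finisher (the `break`), the loop in this sketch would reset its offset and read the source again, which is exactly the duplicate-entries behavior described in the question.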



If I am using the Directory origin instead of the JDBC Query Consumer, then only 1000 records from the files get processed before the Pipeline Finisher stops the pipeline. Do you know what I can do to prevent this?

Shashank Shukla ( 2019-12-18 04:59:56 -0600 )

Ensure that you have set a precondition of ${record:eventType() == 'no-more-data'} on the Pipeline Finisher, and set 'On Record Error' to 'Discard'.

metadaddy ( 2019-12-18 13:07:50 -0600 )
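A toy sketch of why that precondition matters. An origin like Directory can emit several event types (the event names below are assumptions for illustration, not a guaranteed list): a finisher that stops on any event it receives halts after the first one, while a finisher guarded by the no-more-data precondition ignores the others and waits until the source is truly exhausted:

```python
def stop_index(events, precondition=None):
    """Return the index of the event that stops the pipeline,
    or None if no event matches the finisher's precondition."""
    for i, event_type in enumerate(events):
        # The precondition plays the role of
        # ${record:eventType() == 'no-more-data'}
        if precondition is None or event_type == precondition:
            return i
    return None

# Hypothetical event stream from a file-based origin
events = ["new-file", "finished-file", "new-file",
          "finished-file", "no-more-data"]

assert stop_index(events) == 0                  # no precondition: stops immediately
assert stop_index(events, "no-more-data") == 4  # waits for the final event
```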
