
JDBC Query Consumer event generated before completing data process

asked 2020-10-06 12:58:07 -0600 by Sujata HS

updated 2021-01-20 05:37:44 -0600

Hi All,

I am new to StreamSets and am migrating a few pipelines from an S3 origin to a PostgreSQL origin. I am using the "JDBC Query Consumer" origin to read records from a PostgreSQL table, and performing a count check on the "no-more-data" event.

But the event is triggered before the data load/process completes.

For example: the total record count is 641228. With a batch size of 1000, only 640000 records are inserted when "no-more-data" is triggered. With a batch size of 100, only 641200 records are processed, and with a batch size of 10, only 641220.
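The shortfalls above roughly match the size of the final partial batch. A minimal sketch of that arithmetic, assuming (hypothetically) that the origin generates "no-more-data" as soon as its query result set is exhausted, while the last partial batch is still in flight to the destination:

```python
# Hypothetical model: when "no-more-data" fires, only complete batches have
# been committed; the final partial batch (total % batch_size rows) is still
# in flight. This is an illustration, not StreamSets' actual internals.

TOTAL = 641_228  # total rows in the source table, from the question above


def records_committed_at_event(total, batch_size):
    """Rows committed when the event fires, under the assumption that
    only complete batches have been flushed to the destination."""
    return total - (total % batch_size)


for batch_size in (100, 10):
    print(batch_size, records_committed_at_event(TOTAL, batch_size))
```

For batch sizes 100 and 10 this reproduces the reported counts (641200 and 641220) exactly; the batch-size-1000 figure (640000 rather than 641000) suggests more than one batch may have been in flight in that run.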

I want all records to be processed and loaded into the target table before any event (no-more-data) is triggered.

Can anyone help with what the reason could be? Any resolution is much appreciated!

Thanks.


1 Answer


answered 2020-10-10 03:25:36 -0600 by abdulsyed

Hi. If I understand correctly, you are trying to load data from S3 into a Postgres DB, is that right? If so, you can use the Pipeline Finisher executor, which runs once all your records from S3 have been read and loaded into the Postgres table.

Let me know how you go.


Comments

Hi, I am reading data from PostgreSQL using the "JDBC Query Consumer" origin, and I am connecting a JDBC Lookup to the "no-more-data" event to check the record count in the target table. If the counts don't match, the pipeline fails. The issue is that the pipeline is failing before the last batch is loaded.
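One way to make a count check like this tolerant of in-flight batches is to poll the target count until it stops changing before comparing, rather than failing on the first mismatch. A hypothetical sketch (the function names and the `get_target_count` callback are assumptions, not StreamSets APIs):

```python
import time


def verify_counts(get_target_count, expected, poll_interval=5.0, max_polls=12):
    """Poll the target row count until it matches `expected`, or until it
    stops changing (i.e., no more batches are being committed).
    `get_target_count` is a hypothetical callback that runs a COUNT(*)
    against the target table and returns the result as an int."""
    last = -1
    for _ in range(max_polls):
        current = get_target_count()
        if current == expected:
            return True
        if current == last:  # count has stabilized short of the expected value
            return False
        last = current
        time.sleep(poll_interval)
    return False
```

The idea is simply to give the destination time to commit its final batch before declaring a mismatch; the same effect might also be achieved inside the pipeline by delaying the count check downstream of the "no-more-data" event.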

Sujata HS ( 2021-01-20 05:33:35 -0600 )
