
How to run multiple pipelines to load deltas into Postgres

asked 2019-07-26 02:54:13 -0600 by AC

updated 2019-08-02 11:25:23 -0600 by metadaddy

My scenario is that I would like to load a delta from one schema in Postgres to another schema in the same Postgres database. I don't want to do a full load; instead, an incremental load with updates or deletes.

The solution I found is:

  • Origin : JDBC Query Consumer : select * from postgres_schema01.view_a
  • Destination : JDBC Query executor : insert into postgres_schema02.table_a on conflict do update set... (upsert)

But I have to create 20 different pipelines for 20 different views. Is there any way to create one pipeline instead of 20 different pipelines to solve this scenario?
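The destination's statement follows Postgres's `INSERT ... ON CONFLICT DO UPDATE` upsert syntax. A minimal sketch of that pattern, with hypothetical table and column names (demonstrated here against SQLite, which supports the same `ON CONFLICT` clause as of SQLite 3.24, so the snippet is self-contained):

```python
import sqlite3

# In-memory stand-in for postgres_schema02.table_a.
# Column names are hypothetical; substitute the view's real columns.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE table_a (id INTEGER PRIMARY KEY, name TEXT, updated TEXT)")

upsert = """
INSERT INTO table_a (id, name, updated)
VALUES (?, ?, ?)
ON CONFLICT (id) DO UPDATE SET
    name = excluded.name,
    updated = excluded.updated
"""

conn.execute(upsert, (1, "first", "2019-07-26"))
# Same primary key -> the conflict clause fires and the row is updated in place.
conn.execute(upsert, (1, "first-amended", "2019-07-27"))

row = conn.execute("SELECT id, name, updated FROM table_a").fetchone()
print(row)  # -> (1, 'first-amended', '2019-07-27')
```

Each batch the pipeline processes re-runs this statement per record, so existing rows are updated and new rows are inserted without a full reload.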



Can you post some details of what you tried? What did your pipeline look like? What query did you use? What was the result - did you see an error?

metadaddy ( 2019-07-29 14:28:37 -0600 )

In my scenario, after I posted the question here, I applied a solution. It is: Origin : JDBC Query Consumer : select * from postgres_schema01.view_a Destination : JDBC Query : insert into postgres_schema02.table_a on conflict do update set... (upsert) But nearly I have to create 20 different pi

AC ( 2019-07-30 07:12:16 -0600 )

Can you edit your question to include that information? It looks like you ran out of space.

metadaddy ( 2019-07-31 09:34:05 -0600 )

@metadaddy does StreamSets support multiple running instances of the same pipeline? For example: if I have a streaming pipeline, can I run 2 concurrent instances of it using different parameters?

Satendra Tiwari ( 2019-08-02 05:04:29 -0600 )

I tidied up your question and posted an answer. Thanks for giving some clarification!

metadaddy ( 2019-08-02 11:25:53 -0600 )

1 Answer


answered 2019-08-02 11:22:11 -0600 by metadaddy

You can parameterize pipelines using runtime parameters.

A single instance of Data Collector can only run one instance of a given pipeline. To run multiple instances concurrently, you would need to install multiple instances of Data Collector, or use StreamSets Control Hub, which lets you create jobs that associate parameters with a pipeline and run multiple jobs across one or more Data Collector instances.
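With runtime parameters, the 20 pipelines collapse into one: define parameters on the pipeline's Parameters tab and reference them with `${...}` expressions in the stage configurations. A sketch, where `SOURCE_VIEW` and `TARGET_TABLE` are hypothetical parameter names:

```
-- JDBC Query Consumer, SQL Query:
SELECT * FROM postgres_schema01.${SOURCE_VIEW}

-- JDBC Query executor, SQL Query (upsert):
INSERT INTO postgres_schema02.${TARGET_TABLE} (...)
VALUES (...)
ON CONFLICT (...) DO UPDATE SET ...
```

You then start the pipeline (or, in Control Hub, each job) with a different value for `SOURCE_VIEW` and `TARGET_TABLE` per view, rather than maintaining 20 copies of the same pipeline.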


