I'm having trouble opening a pipeline in StreamSets. The software is otherwise functional (settings, etc.), but when I try to open a pipeline to edit it, it only shows the message "loading". The log messages do not indicate the likely problem.
How do I create a pipeline for XML data pulled from SFTP in StreamSets? In the preview, we are getting only one record from the SFTP origin.
Hello, I am creating the simplest of pipelines: a query (JDBC Query Consumer) whose output is written to files with Local FS. I have split the output into files of 200k records each, but this job never stops running and keeps writing file after file of what I assume is duplicate data. I cannot for the life of me figure out what I did wrong. Any insight would be greatly appreciated! I have created complex pipelines in the past, so this is strange.
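A common cause of this symptom (a sketch of one possibility, not a confirmed diagnosis): in full-query mode the JDBC Query Consumer re-runs the same query at every query interval, re-reading all rows each time. In incremental mode, the origin substitutes the last stored offset into the query so each run resumes where the previous one stopped. Assuming a hypothetical table `my_table` with a monotonically increasing `id` column, the incremental query would look like:

```sql
-- Incremental mode: ${OFFSET} is replaced with the last stored offset value.
-- In the origin configuration: Offset Column = id, Initial Offset = 0.
SELECT * FROM my_table WHERE id > ${OFFSET} ORDER BY id
```

With this shape, once the table is exhausted the origin stops producing duplicate batches instead of re-reading the table on every interval.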
Hi Team, I generated a Credential ID and Token via the StreamSets UI → Manage → API Credential. With the generated ID and Token I can run a curl command: the returned status is "HTTP/1.1 200 OK", and the response is JSON showing my organization id, email, etc. However, when I try to connect with the code snippet below (using the same Credential ID and Token), it returns a 403 error. Code and error pasted below:

    >>> from streamsets.sdk import ControlHub
    >>> sch = ControlHub(credential_id='absd_myid', token="abcd_mytoken.")
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/opt/module/sfsdcvenv/lib/python3.8/site-packages/streamsets/sdk/sch.py", line 141, in __init__
        self.api_client = sch_api.ApiClient(
      File "/opt/module/sfsdcvenv/lib/python3.8/site-packages/streamsets/sdk/sch_api.py", line 96, in __init__
        raise ValueError('Encountered error while decoding auth token: {}'.format(e))
    ValueError: Encountered error while decoding auth token
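One thing worth noting about the traceback above: the ValueError is raised while decoding the token locally, before any HTTP request is made, which suggests the token string itself is truncated or mangled (note the trailing "." in the pasted snippet). As an illustration of why local decoding fails, here is a minimal sketch, assuming the token is a base64url-encoded JSON structure (a JWT-style format; the exact layout of Control Hub tokens is an assumption here, and `decode_token_segment` is a hypothetical helper, not part of the SDK):

```python
import base64
import binascii
import json

def decode_token_segment(segment: str) -> dict:
    """Base64url-decode one token segment into JSON.

    Raises ValueError when the segment is not valid base64/JSON,
    which is how a truncated or copy-paste-mangled token surfaces
    locally, before any request reaches the server.
    """
    # base64url decoding requires the input padded to a multiple of 4
    padded = segment + "=" * (-len(segment) % 4)
    try:
        return json.loads(base64.urlsafe_b64decode(padded))
    except (binascii.Error, json.JSONDecodeError) as e:
        raise ValueError(f"Encountered error while decoding auth token: {e}")

# A well-formed segment decodes cleanly...
good = base64.urlsafe_b64encode(
    json.dumps({"org": "example-org"}).encode()
).decode().rstrip("=")
print(decode_token_segment(good))  # {'org': 'example-org'}

# ...while a mangled token raises the ValueError locally.
try:
    decode_token_segment("abcd_mytoken")
except ValueError as e:
    print(e)
```

The practical takeaway of the sketch: if the SDK fails with a decode error rather than an HTTP status, re-copy the token verbatim and check for stray characters before suspecting permissions.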
Hi team, we have a pipeline with a configured Stop Event that runs a SQL statement once the pipeline finishes processing data. However, we found that when the pipeline fails for some reason, this Stop Event still runs, which is not what we expect. Can you please let me know whether there is somewhere this can be configured, so that the pipeline Stop Event does not run when the pipeline fails while processing data? I did find that if the pipeline Start Event fails, the pipeline will not run, which is expected; however, the Stop Event always runs, even when the pipeline fails.
The HTTP Client in my pipeline is not processing all of the input records it receives. For example, the input to the HTTP Client is 1430 records, but the output from the same stage is only 1360 records, with 0 error records. I am not sure whether I am missing some configuration that would balance the input/output record counts and send the failing records to the error stage.
Hello there, I am trying to solve a very specific use case. I am querying a DB via an API, and this API takes multiple query parameters. One of the query parameters is ids, of which there are more than 100. The catch is that I can't pass all 100 ids as an array in the API call, because the API is not designed to accept an array for that parameter. So it has to loop over those 100 ids one by one, calling the API with a new id as the parameter value on each iteration. Also, these ids need to be fetched from a Snowflake table and then passed as the parameter to the API call, so I am thinking of using a Snowflake or JDBC Query Consumer as the origin. These ids will also grow over time, so I want to make this as dynamic as possible, though that is not the priority for now. Using multiple jobs to solve this would lead to 100+ jobs, and that number would keep increasing, which is not good practice at all. Could someone please suggest the best possible approach?
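Stripped of the pipeline machinery, the fan-out the post describes is a single loop: fetch the ids once from the source, then issue one API call per id. A minimal sketch, where both the id source and the API call are stand-in callables (placeholders, not real Snowflake or StreamSets APIs; in a pipeline this corresponds to one HTTP Client invocation per incoming record, with the id carried in a record field):

```python
from typing import Callable, Iterable, List

def fan_out(ids: Iterable[str], call_api: Callable[[str], dict]) -> List[dict]:
    """Call the API once per id and collect the responses."""
    results = []
    for id_ in ids:
        # Each iteration is one API call with a single-id parameter,
        # mirroring an API that rejects array-valued parameters.
        results.append(call_api(id_))
    return results

# Usage with stand-ins for the Snowflake query result and the real endpoint:
fake_ids = ["a1", "b2", "c3"]
fake_api = lambda id_: {"id": id_, "status": "ok"}
print(fan_out(fake_ids, fake_api))
```

The point of the sketch is the shape of the solution: one job that iterates over records, rather than one job per id.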
Hi, in my pipeline I have a Stream Selector stage. I want to parameterize it, using the following expression for the condition:

    ${record:value('/rating_text') == '${pipeline_rating_text}'}

Here, pipeline_rating_text is a parameter that I have defined for my pipeline. The problem is that when I run the pipeline, the condition does not work. If I use the expression ${record:value('/rating_text') == 'Excellent'} instead, everything is fine. Can somebody help me, please?
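A likely cause, stated as an assumption about StreamSets expression-language semantics rather than a confirmed fix: ${...} expressions are not nested inside one another. Runtime parameters are exposed as constants within the expression language, so inside an existing expression the parameter is referenced by its bare name:

```
${record:value('/rating_text') == pipeline_rating_text}
```

If this reading is correct, the nested '${pipeline_rating_text}' in the original condition is compared as a literal string rather than resolved to the parameter's value.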
Hi, I have a JDBC connection for a database located in a Docker container on my local machine. This connection works perfectly when I build a Data Collector pipeline. I then installed a Transformer engine in a Docker container on my local machine (with the external JDBC libraries installed) and made a very simple pipeline to read from my database using this JDBC connection. I constantly get this error message: "[JDBC Table 1] Cannot connect to specified database: Communications link failure. The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server. (JDBC_00)". Can anyone help me solve the problem?
Dears, I have configured local Docker images for StreamSets and Kafka to try a simple Kafka connection, but I'm receiving the following:

    Http failure response for https://na01.hub.streamsets.com/tunneling/rest/660c2f92-c396-4322-a9ea-cd73758897a1/rest/v1/pipeline/dynamicPreview?TUNNELING_INSTANCE_ID=tunneling-1: 500 OK

The Kafka container names are:

    bash-3.2$ docker-compose ps
    NAME        IMAGE                    COMMAND                  SERVICE     CREATED          STATUS          PORTS
    kafka       wurstmeister/kafka       "start-kafka.sh"         kafka       27 minutes ago   Up 27 minutes   0.0.0.0:9092->9092/tcp
    zookeeper   wurstmeister/zookeeper   "/bin/sh -c '/usr/sb…"   zookeeper   27 minutes ago   Up 27 minutes   22/tcp, 2888/tcp, 3888/tcp, 0.0.0.0:2181->2181/tcp

When I run Test Connection with kafka:9092 or localhost:9092, I receive the same Http failure response for https://na01.hub.streamsets.com/tunneling
I'm getting this error when trying to use the Snowflake uploader. I manually created the stage, and I know I can put files there. I'm on Data Collector 5.8. How do I overcome this error?
Hi, I am facing an issue while starting my SDC service; SDC is not coming up. It is prod and needs to be up, so can someone please help with this at the earliest?

    Caused by: com.streamsets.datacollector.store.PipelineStoreException: CONTAINER_0206 - Cannot load details for pipeline 'SupplyCha__eed7a3d5-486c-499b-baa1-6eac92aa198b__averydennison.com': java.io.IOException: File '/apps/sdc/data/pipelines/SupplyCha__eed7a3d5-486c-499b-baa1-6eac92aa198b__averydennison.com/pipeline.json-tmp' exists, '/apps/sdc/data/pipelines/SupplyCha__eed7a3d5-486c-499b-baa1-6eac92aa198b__averydennison.com/pipeline.json-old' should exists

We have referred to the linked solution, but we are not sure what can be done, as the files are not there for those pipelines.

Thanks & Regards,
Vishal Verma