
Write JSON objects to RDBMS table

asked 2020-03-02 14:59:36 -0500

dan777

I'm a newbie playing around with SDC (StreamSets Data Collector), trying to write the results of a simple REST API call to a single-column RDBMS table in Snowflake. The single destination column should hold the full JSON string for each of the "objects" returned by the API call (recent issues for the rails repo). Using an HTTP Client origin and writing to a Local FS destination works.

However, writing to a Snowflake destination produces a "missing field 'COLUMN_NAME'" error, where COLUMN_NAME is the name of the single column in the Snowflake table. I'm sure the solution involves a processor that merges all of the JSON values/paths into a single string before writing to the destination, but I've spent way too much time trying to figure that processor out.

Any insights or hints would be appreciated!
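To picture the target shape: each returned "object" should end up as one serialized JSON string occupying the single column of its row. A minimal stdlib Python sketch of that idea (the sample records below are made-up stand-ins, not real API output):

```python
import json

# Hypothetical stand-ins for two "issue" objects returned by the API call.
issues = [
    {"number": 1, "title": "first issue", "state": "open"},
    {"number": 2, "title": "second issue", "state": "closed"},
]

# One row per object; each row is a single column holding the full JSON string.
rows = [json.dumps(issue) for issue in issues]

for row in rows:
    print(row)
```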


1 Answer


answered 2020-03-03 13:26:21 -0500

updated 2020-03-09 11:57:08 -0500

I think you need to rename the column before you pass it to Snowflake. I just did this with a local JSON file and got the same message as you until I inserted a Field Renamer between the source and destination in StreamSets. Now I'm getting a data type error:

SNOWFLAKE_05 - Field 'MYCOLUMN' has invalid type 'STRING', Snowflake column type is 'VARIANT'

That's a step in the right direction though. :)




I created my single column in Snowflake as a variant data type. I changed that to a string, and now my pipeline works just fine, inserting one row per JSON record into my destination table.
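One note on that fix: storing the record in a STRING column instead of a VARIANT loses nothing, because the JSON text can still be parsed back into structured data on read (in Snowflake via PARSE_JSON, or client-side). A quick round-trip sketch in Python, using a hypothetical record:

```python
import json

# A made-up example of one record written to the pipeline.
issue = {"number": 42, "title": "example issue", "labels": ["bug", "regression"]}

# What lands in the single STRING column: the serialized JSON text.
stored = json.dumps(issue)

# Parsing the stored text recovers the original structure unchanged.
assert json.loads(stored) == issue
print(stored)
```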

asturt ( 2020-03-03 13:37:10 -0500 )