Error with Google BigQuery destination

asked 2017-10-09 03:00:10 -0600 by prachi

updated 2017-10-09 10:14:50 -0600 by metadaddy

I am using Kafka Consumer as the origin and Google BigQuery as the destination in my StreamSets Data Collector pipeline. The Data Format at the origin is JSON. I get the following error:
'Root field of record should be a Map or a List Map'

How can I resolve this issue?




What is the type of record you are trying to write? If you run a preview, you should be able to see it easily. My guess is that you are consuming from Kafka as TEXT (the default), so the root field is simply a STRING.

jeff ( 2017-10-09 09:35:01 -0600 )

Thanks for your suggestion. The data format at Kafka Consumer stage is already JSON. Setting the JSON Content option to 'JSON array of objects' instead of 'Multiple JSON objects' solved the issue.

prachi ( 2017-10-09 23:30:19 -0600 )

1 Answer


answered 2017-10-09 23:30:24 -0600 by prachi

updated 2017-10-10 14:05:21 -0600 by metadaddy

Setting Data Format to JSON and JSON Content to 'JSON array of objects' solved the issue.
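For anyone hitting the same error: the two JSON Content settings correspond to differently shaped message payloads. A minimal sketch in Python (the payloads here are made-up examples, not from the original pipeline) showing why the setting matters:

```python
import json

# Payload shaped as a JSON *array* of objects -- one Kafka message,
# many objects. This needs JSON Content = 'JSON array of objects';
# otherwise the whole array is read as a single LIST record, whose
# root field is a List rather than a Map or List-Map.
array_payload = '[{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]'
records = json.loads(array_payload)
for record in records:
    # Each element parses to a map, which is what the
    # BigQuery destination expects as a record's root field.
    assert isinstance(record, dict)

# Payload shaped as *multiple* standalone JSON objects, one per line.
# This is what JSON Content = 'Multiple JSON objects' expects.
multi_payload = '{"id": 1, "name": "a"}\n{"id": 2, "name": "b"}'
for line in multi_payload.splitlines():
    assert isinstance(json.loads(line), dict)
```

In short, the setting has to match how the producer serialized the messages: a single array per message vs. one object per line.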

