Setting Decimal Field Scale & Precision (Groovy)

asked 2020-09-23 13:37:08 -0600

erik

updated 2020-09-23 13:42:09 -0600

I am reading in Avro data (all fields written as String types) via a Hadoop FS origin and writing to Hadoop FS and Hive. Most of these fields need to be converted to other data types, including decimals, to match the target table schema, which is already created. I have a JDBC lookup that returns the column names and the data types to convert to. I was able to write more complex expressions for all the other data type conversions, but decimals are proving to be a bit of a pain, in particular setting the scale and precision for each decimal field dynamically according to the lookup results. For example, a lookup for a decimal field returns "decimal(16,2)", which is easy to parse scale and precision out of, but then I need to set them on the field.
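For what it's worth, this is roughly how I'm parsing the lookup result. It's just a sketch: the method name and regex are mine, and the only assumption taken from the pipeline is that the lookup returns a type string like "decimal(16,2)":

```groovy
// Parse precision and scale out of a lookup result like "decimal(16,2)".
// Returns null for non-decimal types (e.g. "varchar(20)").
def parseDecimalType(String typeSpec) {
    def m = typeSpec =~ /(?i)decimal\s*\(\s*(\d+)\s*,\s*(\d+)\s*\)/
    m.find() ? [precision: m.group(1) as int, scale: m.group(2) as int] : null
}
```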

It seems the Field Type Converter doesn't support expressions for the Scale value, only a literal integer. So I am offloading decimal field conversions to a Groovy Evaluator, where I can mimic record headers the way the JDBC origin does; I am just not sure, based on the documentation, how to set scale and precision on a decimal field via Groovy. Is this possible?
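The closest I've gotten is plain BigDecimal handling in the script, since a BigDecimal carries its own scale. A minimal sketch of the conversion step, assuming the String value and target scale are already in hand (the helper name is mine, and in the evaluator the result would be assigned back with something like `record.value['myField'] = ...`):

```groovy
import java.math.RoundingMode

// Coerce a String field value to a BigDecimal with the target scale.
// BigDecimal tracks scale itself, so writing this value back to the
// record should yield a Decimal field with that scale.
BigDecimal toDecimal(String raw, int scale) {
    new BigDecimal(raw).setScale(scale, RoundingMode.HALF_UP)
}
```

What I can't confirm from the docs is the precision side: as I understand it, the JDBC origin conveys precision and scale to downstream stages via field attributes, and I don't see how to set those from the Groovy Evaluator.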
