export to spark StructType misinterprets decimal precision and scale #996

@quimort

Description

I have the following column defined in my data contract YAML:

- name: competitor_sale_price_gross_local_val 
  id: competitor_sale_price_gross_local_val
  physicalName: competitor_sale_price_gross_local_val
  businessName: Competitor sale price gross local value
  physicalType: decimal
  description: "Sales price from the competitor"
  examples:
    - "xyz" 

After exporting with the to_spark_dict function from datacontract.export.spark_exporter, the resulting StructType types this column as decimal(38,0). I checked the code to see how precision and scale are set:

if physical_type in ["decimal", "numeric"]:
    precision = _get_logical_type_option(prop, "precision") or 38
    scale = _get_logical_type_option(prop, "scale") or 0

def _get_logical_type_option(prop: SchemaProperty, key: str):
    """Get a logical type option value."""
    if prop.logicalTypeOptions is None:
        return None
    return prop.logicalTypeOptions.get(key)

However, logicalTypeOptions accepts neither precision nor scale, and I get the following lint error:

Run operation failed: [lint] Check that data contract YAML is valid - None - ResultEnum.failed - data.schema[0].properties[3].logicalTypeOptions must not contain {'precision', 'scale'} properties - datacontract
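For reference, the variant that triggers this lint failure looks like the following (a hypothetical reconstruction of my attempt; the validator rejects precision and scale under logicalTypeOptions for this property):

```yaml
- name: competitor_sale_price_gross_local_val
  physicalType: decimal
  logicalTypeOptions:
    precision: 18   # rejected by lint
    scale: 4        # rejected by lint
```

So there appears to be no valid way in the contract YAML to feed precision and scale through to the Spark exporter.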
