Tags: sql-server, azure-data-factory, avro, on-premises-instances, copy-data

Why doesn't Data Factory let me copy data from an on-premises SQL Server to an Avro file in the cloud?


The message that appears is the following: ErrorCode=AvroFailedToSerializeData,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Serializae Avro Data failed. Message: Name 'GENERIC_COLUMN' contains invalid characters..,Source=Microsoft.DataTransfer.Common,''Type=System.Runtime.Serialization.SerializationException,Message=Name 'GENERIC_COLUMN' contains invalid characters.,Source=Microsoft.Hadoop.Avro,'

The problem is that the string columns cannot be migrated to Avro. I have already migrated from an on-premises SQL Server to Avro in the cloud before. I like to use Avro because I thought it preserved data integrity better than, say, a CSV, which is weaker. What could the problem be?


Solution

  • Serializae Avro Data failed. Message: Name 'GENERIC_COLUMN' contains invalid characters.

    You are getting this issue because the GENERIC_COLUMN name contains characters that are invalid in Avro, such as -, *, spaces, or other special characters. Avro field names may only contain letters, digits, and underscores, and must start with a letter or underscore.

    In the copy activity's Mapping tab, check every field name destined for the Avro file; if a name contains special characters, remove or rename them (see the sketch below this answer for a quick way to spot and rename offending columns).


    Also check the schema of your destination Avro file; if any field names there are invalid, correct them as well.
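
As a quick way to find the columns that will break Avro serialization before you set up the mapping, here is a minimal Python sketch. It is not part of Data Factory; the column names are hypothetical examples, and the rule it enforces is the Avro naming rule described above (letters, digits, and underscores only, not starting with a digit).

```python
import re

# Avro field names must start with a letter or underscore and may
# contain only letters, digits, and underscores.
AVRO_NAME = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")

def sanitize(name: str) -> str:
    """Return an Avro-safe version of a SQL column name."""
    # Replace every disallowed character with an underscore.
    safe = re.sub(r"[^A-Za-z0-9_]", "_", name)
    # Prefix an underscore if the name starts with a digit.
    if safe[:1].isdigit():
        safe = "_" + safe
    return safe

# Hypothetical column names similar to the ones that trigger the error.
columns = ["GENERIC-COLUMN", "Order Total ($)", "2021_sales", "customer_id"]

for col in columns:
    if AVRO_NAME.match(col):
        print(f"OK     : {col}")
    else:
        print(f"RENAME : {col} -> {sanitize(col)}")
```

You can apply the suggested names either by aliasing the columns in the source query or by setting the sink column names in the copy activity's mapping, so the Avro file never sees the invalid characters.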