scala · avro · avro-tools

Not a data file error while reading Avro file


I have a file with data in Avro format. I would like to read this data into a GenericRecord (or any other suitable data structure) so that I can send it from Kafka to Spark.

I tried to use DataFileReader, but it failed with this error:

Exception in thread "main" java.io.IOException: Not a data file.
    at org.apache.avro.file.DataFileStream.initialize(DataFileStream.java:105)

Here is the code that produced it:

import java.io.File
import scala.io.Source
import org.apache.avro.Schema
import org.apache.avro.file.DataFileReader
import org.apache.avro.generic.{GenericDatumReader, GenericRecord}

// Parse the schema from the .avsc file
val schema = Source.fromFile(schemaPath).mkString
val avroSchema = new Schema.Parser().parse(schema)

val avroDataFile = new File(dataPath)
val avroReader = new GenericDatumReader[GenericRecord](avroSchema)
// THIS LINE PRODUCED THE ERROR
val dataFileReader = new DataFileReader[GenericRecord](avroDataFile, avroReader)

How can I fix this error?

This is what my Avro schema looks like:

{
  "type" : "record",
  "namespace" : "input_data",
  "name" : "testUser",
  "fields" : [
    {"name" : "name", "type" : "string", "default": "NONE"},
    {"name" : "age", "type" : "int", "default": -1},
    {"name" : "phone", "type" : "string", "default" : "NONE"},
    {"name" : "city", "type" : "string", "default" : "NONE"},
    {"name" : "country", "type" : "string", "default" : "NONE"}
  ]
}

And this is the data I tried to read (it was generated by this tool):

{
  "name" : "O= ~usP3\u0001\bY\u0011k\u0001",
  "age" : 585392215,
  "phone" : "\u0012\u001F#\u001FH]e\u0015UW\u0000\fo",
  "city" : "aWi\u001B'\u000Bh\u00163\u001A_I\u0001\u0001L",
  "country" : "]H\u001Dl(n!Sr}oVCH"
}
{
  "name" : "\u0011Y~\fV\u001Dv%4\u0006;\u0012",
  "age" : -2045540864,
  "phone" : "UyOdgny-hA",
  "city" : "\u0015f?\u0000\u0015oN{\u0019\u0010\u001D%",
  "country" : "eY>c\u0010j\u0002[\u001CdDQ"
}
...

Solution

  • Well, that data is not Avro, it is JSON.

    If it were binary Avro data, you would not be able to read the file without first using the avro-tools.jar tojson action.

    If you look at the usage doc, JSON is the default output:

    -j, --json: Encode outputted data in JSON format (default)
    

    To actually get Avro output, use the arguments -s schema.avsc -b -o out.avro

    There are also other ways to generate test data for Kafka.
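If regenerating the data is not an option, the existing JSON records can also be converted to a proper Avro container file from Scala itself, using Avro's JsonDecoder to decode each record against the schema and a DataFileWriter to write the binary file. The sketch below is a minimal, self-contained illustration of that approach: the schema is the one from the question, but the two sample records and the file name out.avro are made up for the example.

```scala
import java.io.File
import org.apache.avro.Schema
import org.apache.avro.file.{DataFileReader, DataFileWriter}
import org.apache.avro.generic.{GenericDatumReader, GenericDatumWriter, GenericRecord}
import org.apache.avro.io.DecoderFactory

// The schema from the question, inlined so the sketch is self-contained.
val schemaJson =
  """{
    |  "type": "record", "namespace": "input_data", "name": "testUser",
    |  "fields": [
    |    {"name": "name",    "type": "string", "default": "NONE"},
    |    {"name": "age",     "type": "int",    "default": -1},
    |    {"name": "phone",   "type": "string", "default": "NONE"},
    |    {"name": "city",    "type": "string", "default": "NONE"},
    |    {"name": "country", "type": "string", "default": "NONE"}
    |  ]
    |}""".stripMargin
val schema = new Schema.Parser().parse(schemaJson)

// Two made-up JSON records, standing in for the generated data file.
val jsonRecords = Seq(
  """{"name": "alice", "age": 30, "phone": "123", "city": "Oslo", "country": "NO"}""",
  """{"name": "bob", "age": 42, "phone": "456", "city": "Bergen", "country": "NO"}"""
)

// Decode each JSON record into a GenericRecord and append it to a
// binary Avro container file. DataFileWriter writes the container
// header that DataFileReader was missing in the question.
val datumReader = new GenericDatumReader[GenericRecord](schema)
val out = new File("out.avro")
val writer = new DataFileWriter(new GenericDatumWriter[GenericRecord](schema))
writer.create(schema, out)
jsonRecords.foreach { json =>
  val decoder = DecoderFactory.get().jsonDecoder(schema, json)
  writer.append(datumReader.read(null, decoder))
}
writer.close()

// Now the code from the question works, since out.avro is a real Avro data file.
val fileReader = new DataFileReader[GenericRecord](out, new GenericDatumReader[GenericRecord](schema))
while (fileReader.hasNext) println(fileReader.next())
fileReader.close()
```

Note that this only works because the record has no union fields, so the plain JSON produced by the generator happens to match Avro's JSON encoding.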