Tags: hadoop, avro, parquet

ParquetWriter outputs empty parquet file in a java stand alone program


I tried to convert an existing Avro file to Parquet, but the output Parquet file is empty. I am not sure what I did wrong...

My code snippet:

    FileReader<GenericRecord> fileReader = DataFileReader.openReader(
            new File("output/users.avro"), new GenericDatumReader<GenericRecord>());

    Schema avroSchema = fileReader.getSchema();

    // generate the corresponding Parquet schema
    MessageType parquetSchema = new AvroSchemaConverter().convert(avroSchema);

    // choose compression scheme
    CompressionCodecName compressionCodecName = CompressionCodecName.UNCOMPRESSED;

    // set the Parquet page size
    int pageSize = 64 * 1024;

    Path outputPath = new Path("output/users.parquet");

    // create a parquet writer using builder
    ParquetWriter parquetWriter = (ParquetWriter) AvroParquetWriter.builder(outputPath)
            .withSchema(avroSchema)
            .withCompressionCodec(compressionCodecName)
            .withPageSize(pageSize)
            .build();

    // read avro, write parquet
    while (fileReader.hasNext()) {
        GenericRecord record = fileReader.next();

        System.out.println(record);

        parquetWriter.write(record);
    }

Solution

  • I had the same problem and found that I needed to close the parquetWriter before the data was committed to the file. ParquetWriter buffers records in memory and only flushes the row groups and writes the Parquet file footer when it is closed, so without a close the output file stays empty. Just add

    parquetWriter.close();
    

    after the while loop.
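
    A tidier variant of the same fix is to open the writer in a try-with-resources block, so it is closed (and the footer written) even if an exception is thrown mid-loop. This is a sketch assuming the same parquet-avro and avro dependencies and the same input/output paths as the question:

    ```java
    import java.io.File;

    import org.apache.avro.Schema;
    import org.apache.avro.file.DataFileReader;
    import org.apache.avro.file.FileReader;
    import org.apache.avro.generic.GenericDatumReader;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.hadoop.fs.Path;
    import org.apache.parquet.avro.AvroParquetWriter;
    import org.apache.parquet.hadoop.ParquetWriter;
    import org.apache.parquet.hadoop.metadata.CompressionCodecName;

    public class AvroToParquet {
        public static void main(String[] args) throws Exception {
            // open the Avro input and read its schema
            FileReader<GenericRecord> fileReader = DataFileReader.openReader(
                    new File("output/users.avro"), new GenericDatumReader<GenericRecord>());
            Schema avroSchema = fileReader.getSchema();

            // try-with-resources closes the writer automatically, which
            // flushes the buffered row groups and writes the Parquet footer
            try (ParquetWriter<GenericRecord> parquetWriter =
                    AvroParquetWriter.<GenericRecord>builder(new Path("output/users.parquet"))
                            .withSchema(avroSchema)
                            .withCompressionCodec(CompressionCodecName.UNCOMPRESSED)
                            .withPageSize(64 * 1024)
                            .build()) {
                while (fileReader.hasNext()) {
                    parquetWriter.write(fileReader.next());
                }
            } finally {
                fileReader.close();
            }
        }
    }
    ```

    Note that the generic builder (`AvroParquetWriter.<GenericRecord>builder(...)`) also removes the raw-type cast from the original snippet.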