java, apache-spark, spark-streaming, spark-structured-streaming

Watermark not showing correct output in Spark


I am sending streaming data to Spark using a netcat server (-l listens on the port, -k keeps the socket open across connections):

nc -lk 9999

I am sending data in the following format:

Time,number
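
For example, the lines typed into the netcat session might look like this (hypothetical values; depending on the Spark version, the CAST to TIMESTAMP in the code below may need a full date rather than a bare time):

2023-01-01 10:00:00,5
2023-01-01 09:48:00,10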

In Spark, I split each line on the comma and perform a groupBy aggregation. Here is my code:

package org.example;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.functions;
import org.apache.spark.sql.streaming.StreamingQuery;
import org.apache.spark.sql.streaming.StreamingQueryException;

import java.util.concurrent.TimeoutException;
import static org.apache.spark.sql.functions.*;
import org.apache.spark.sql.streaming.Trigger;


public class SampleProgram {
    public static void main(String[] args) {
        SparkSession spark = SparkSession
                .builder()
                .appName("Spark-Kafka-Integration")
                .config("spark.master", "local")
                .getOrCreate();

        spark.sparkContext().setLogLevel("ERROR");

        Dataset<Row> lines = spark
                .readStream()
                .format("socket")
                .option("host", "localhost")
                .option("port", 9999)
                .load();

        lines.printSchema();

        Dataset<Row> temp_data = lines.selectExpr("split(value,',')[0] as timestamp", "split(value,',')[1] as value");
        Dataset<Row> data = temp_data.selectExpr("CAST(timestamp AS TIMESTAMP)", "CAST(value AS INT)");

        Dataset<Row> windowedCounts = data
                .withWatermark("timestamp", "10 minutes")
                .groupBy(
                        functions.window(data.col("timestamp"), "5 minutes"),
                        col("value")
                ).count();

        StreamingQuery query = null;
        try {
            query = windowedCounts.writeStream()
                    .outputMode("update")
                    .option("truncate", "false")
                    .format("console")
                    .trigger(Trigger.ProcessingTime("45 seconds"))
                    .start();
        } catch (TimeoutException e) {
            throw new RuntimeException(e);
        }

        try {
            query.awaitTermination();
        } catch (StreamingQueryException e) {
            throw new RuntimeException(e);
        }


    }
}

The issue I am facing is this:

When I give the input, say, 10:00:00,5, it gives this output:

[screenshot of console output]

Now, at this point the max event time is 10:00:00, and I have specified a watermark of 10 minutes, so any event earlier than 10:00:00 - 00:10:00, i.e. 09:50:00, should be rejected. However, when I then give the input 09:48:00,10, it gives this output:

[screenshot of console output]

This seems incorrect to me: the data is already too late and should be dropped by Spark, yet Spark is still counting it. What am I missing here?


Solution

  • Write the groupBy this way, using the unqualified col("timestamp") instead of data.col("timestamp"):

    .groupBy(
         window(col("timestamp"), "5 minutes"),
         col("value")
    ).count();

    The watermark is attached to the new Dataset returned by withWatermark, not to the original data variable. data.col("timestamp") resolves against the pre-watermark Dataset, so Spark plans the aggregation without the watermark and never drops late events. The unqualified col("timestamp") resolves against the watermarked Dataset, and rows older than the watermark are then discarded as expected.
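
Putting it together, a minimal sketch of the corrected aggregation (assuming the same data Dataset and the static import of org.apache.spark.sql.functions.* from the question):

Dataset<Row> windowedCounts = data
        .withWatermark("timestamp", "10 minutes")   // watermark metadata lives on this returned Dataset
        .groupBy(
                window(col("timestamp"), "5 minutes"),  // unqualified col(...) resolves against the watermarked Dataset
                col("value"))
        .count();

With this change, once the watermark has advanced past 09:50:00 (i.e. after the 10:00:00 event has been processed in an earlier micro-batch), the late row 09:48:00,10 should be dropped in update mode instead of producing a new count.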