I seem to be missing something about stream–static joins in Spark 2.2.
The manual states that such a join is possible, but I cannot get the syntax right. No watermark is being used.
val joinedDs = salesDs
.join(customerDs, "customerId", joinType="leftOuter")
The error I get is as follows, and I am fairly sure I have the streaming and static sides the right way around:
<console>:81: error: overloaded method value join with alternatives:
(right: org.apache.spark.sql.Dataset[_],joinExprs:
org.apache.spark.sql.Column,joinType: String)org.apache.spark.sql.DataFrame <and>
(right: org.apache.spark.sql.Dataset[_],usingColumns: Seq[String],joinType: String)org.apache.spark.sql.DataFrame
cannot be applied to (org.apache.spark.sql.Dataset[Customer], String, joinType: String)
.join(customerDs, "customerId", joinType="left_Outer")
^
The reason the Seq is needed: Dataset.join has a two-argument overload that accepts a single column name, join(right, usingColumn: String), but the three-argument overloads that accept a join type take either a Column join expression or a Seq[String] of column names. There is no (Dataset, String, String) variant, which is exactly what the compiler error above is saying. So with an explicit joinType you must wrap the column name in a Seq:
.join(customerDs, Seq("customerId"), "left_Outer")
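For context, here is a minimal end-to-end sketch of a stream–static left outer join. The paths, schemas, and the salesDs/customerDs names are assumptions for illustration, not taken from the original code:

```scala
import org.apache.spark.sql.{Encoders, SparkSession}

// Hypothetical schemas for the two sides of the join.
case class Customer(customerId: String, name: String)
case class Sale(customerId: String, amount: Double)

val spark = SparkSession.builder.appName("StreamStaticJoin").getOrCreate()
import spark.implicits._

// Static side: a batch Dataset, read once. (Path is an assumption.)
val customerDs = spark.read.json("/path/to/customers").as[Customer]

// Streaming side: files arriving in a directory; streaming sources
// require an explicit schema, derived here from the case class.
val salesDs = spark.readStream
  .schema(Encoders.product[Sale].schema)
  .json("/path/to/sales")
  .as[Sale]

// The three-argument overload takes Seq[String], not a bare String.
// The join-type string is matched case-insensitively, so "left_outer",
// "leftouter", and "left" are all accepted.
val joinedDs = salesDs.join(customerDs, Seq("customerId"), "left_outer")

joinedDs.writeStream
  .format("console")
  .outputMode("append")
  .start()
```

Note that with the streaming Dataset on the left, inner and left outer stream–static joins are the supported combinations; a right/full outer join with the stream on the left is not.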