Below is my SBT configuration:
// https://mvnrepository.com/artifact/org.apache.flink/flink-scala
libraryDependencies += "org.apache.flink" %% "flink-scala" % "1.9.1"
// https://mvnrepository.com/artifact/org.apache.flink/flink-streaming-scala
libraryDependencies += "org.apache.flink" %% "flink-streaming-scala" % "1.9.1"
// https://mvnrepository.com/artifact/org.apache.flink/flink-table-api-scala-bridge
libraryDependencies += "org.apache.flink" %% "flink-table-api-scala-bridge" % "1.9.1"
// https://mvnrepository.com/artifact/org.apache.flink/flink-table-planner
libraryDependencies += "org.apache.flink" %% "flink-table-planner" % "1.9.1"
// https://mvnrepository.com/artifact/org.apache.flink/flink-table-common
libraryDependencies += "org.apache.flink" % "flink-table-common" % "1.9.1"
// https://mvnrepository.com/artifact/org.apache.flink/flink-connector-kafka-base
//libraryDependencies += "org.apache.flink" %% "flink-connector-kafka_2.11" % "1.9.1" // This does not work; SBT fails with an unable-to-resolve error.
Since I was not able to add flink-connector-kafka via SBT, I downloaded the jar and put it in a lib folder that I created manually in my SBT project (the project itself was generated by IntelliJ; only the lib folder was added by hand). With the jar in place, importing the Kafka connector package, i.e. import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer, works fine.
Now, below is my code to consume from Kafka:
import java.util.Properties

import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.api.scala.{DataStream, StreamExecutionEnvironment}
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer
import org.apache.flink.api.scala._

object KafkaFlink {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    val properties = new Properties()
    // properties.setProperty("bootstrap.servers", "localhost:9092")
    // // only required for Kafka 0.8
    // properties.setProperty("zookeeper.connect", "localhost:2181")
    // properties.setProperty("group.id", "test")

    val properties1 = new Properties()
    properties.setProperty("bootstrap.servers", "localhost:9092")
    properties.setProperty("group.id", "test")

    val topic = "flink-fault-testing"
    val flinkKafkaConsumer = new FlinkKafkaConsumer[String](topic, new SimpleStringSchema(), properties1)
    val value: DataStream[String] = env.addSource(flinkKafkaConsumer)
  }
}
This does not compile: I get the error "cannot resolve overloaded method 'addSource'".
Please point out where I am going wrong.
Also, is there a way to get the universal flink-kafka connector directly via build.sbt?
What you want to specify in your SBT config is this:
libraryDependencies += "org.apache.flink" %% "flink-connector-kafka" % "1.9.1"
If you leave off the "_2.11", it should work. That suffix indicates which Scala version the artifact was built for, and with the %% operator SBT appends it for you automatically based on your project's scalaVersion.
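To make that concrete, here is a sketch of the three variants side by side (assuming a Scala 2.11 project; the version numbers are the ones from the question):

```scala
// With %%, SBT appends the Scala binary version to the artifact name itself:
libraryDependencies += "org.apache.flink" %% "flink-connector-kafka" % "1.9.1"

// Equivalent explicit form using a single %, with the suffix written out by hand:
libraryDependencies += "org.apache.flink" % "flink-connector-kafka_2.11" % "1.9.1"

// The original line fails because %% appends the suffix a second time,
// asking for the nonexistent artifact flink-connector-kafka_2.11_2.11:
// libraryDependencies += "org.apache.flink" %% "flink-connector-kafka_2.11" % "1.9.1"
```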
As for the compile error: the code itself looks okay to me. One possible culprit is the manually downloaded jar — if it was built for a different Scala or Flink version than the rest of your dependencies, the consumer's SourceFunction hierarchy won't line up with what addSource expects, which can surface as a "cannot resolve overloaded method" error. Pulling the connector in through the %% dependency above should rule that out.
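Separately from the compile error, one issue worth flagging in the posted code: bootstrap.servers and group.id are set on properties, but the consumer is constructed with properties1, which is never configured. A minimal sketch of what the consumer would actually see (plain java.util.Properties, no Flink needed):

```scala
import java.util.Properties

object PropertiesMixUp {
  def main(args: Array[String]): Unit = {
    // Mirrors the question: two Properties objects, only one configured.
    val properties = new Properties()
    properties.setProperty("bootstrap.servers", "localhost:9092")
    properties.setProperty("group.id", "test")

    val properties1 = new Properties() // passed to FlinkKafkaConsumer, but empty

    // The consumer built from properties1 has no broker address at all:
    println(properties1.getProperty("bootstrap.servers")) // prints: null
    println(properties.getProperty("bootstrap.servers"))  // prints: localhost:9092
  }
}
```

So even once the build compiles, the consumer should be constructed with properties, not properties1 (or the unused properties1 dropped entirely).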