In PyFlink, is it possible to create a DataStream via StreamExecutionEnvironment's add_source(...), apply transformations to it with the DataStream API, and then convert the resulting stream into a form where SQL statements can be executed against it via the Table API?
I have a single stream containing many different types of events, and I'd like to derive several data streams from that one source, each carrying a single type of data. I was thinking I could use side outputs, routed based on the data in the initial stream, and then run different SQL operations against each resulting stream, safe in the knowledge of what the data in each of those separate streams actually is. I don't want a separate Flink job for each data type in the stream.
Yes, you can convert a DataStream into a Table, which then lets you execute SQL statements against it: https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/table/data_stream_api/
Splitting the stream by type with side outputs, as you describe, seems reasonable to me.