Tags: r, apache-spark, apache-spark-sql, sparklyr, r-dbi

sparklyr wrapper for Spark SQL: sqlContext.sql


I am trying to write a wrapper around the Spark SQL function for sparklyr. I have created the following function:

sqlfunction <- function(sc, block) {
  spark_context(sc) %>%
    invoke("sqlContext.sql", block)
}

Then I call it using the following:

newsqlData <- sqlfunction(sc, "select
                          substr(V1,1,2),
                          substr(V1,3,3),
                          substr(V1,6,6),
                          substr(V1,12,4),
                          substr(V1,16,4)
                          FROM TABLE1 WHERE V1 IS NOT NULL")

But I get the following error:

Error: java.lang.IllegalArgumentException: invalid method sqlContext.sql for object 12
at sparklyr.Invoke$.invoke(invoke.scala:113)
at sparklyr.StreamHandler$.handleMethodCall(stream.scala:89)
at sparklyr.StreamHandler$.read(stream.scala:55)
at sparklyr.BackendHandler.channelRead0(handler.scala:49)
at sparklyr.BackendHandler.channelRead0(handler.scala:14)
at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
at java.lang.Thread.run(Thread.java:745)

Any suggestions or fixes would be greatly appreciated.


Solution

  • It should be:

    sqlfunction <- function(sc, block) {
      spark_session(sc) %>% invoke("sql", block)
    }
    

    where sc is a spark_connection (the output of spark_connect(master = master_url)).

    This works because:

    • spark_session(sc) - retrieves the SparkSession from the connection object.
    • invoke("sql", block) - calls the sql method of that SparkSession instance with block as its argument.

    with example usage:

    library(sparklyr)
    
    sc <- spark_connect(master = "local[*]")
    sqlfunction(sc, "SELECT SPLIT('foo,bar', ',')")
    
    <jobj[11]>
      class org.apache.spark.sql.Dataset
      [split(foo,bar, ,): array<string>]
    

    This will give you a reference to a Java object. If you want, you can, for example, register it as a temporary view:

    ... %>% invoke("createOrReplaceTempView", "some_name_for_the_view")
    

    and access with tbl:

    library(dplyr)
    
    tbl(sc, "some_name_for_the_view") 
    

    or

    ... %>% sdf_register()
    

    to get tbl_spark object directly.
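    Putting the pieces together, a minimal end-to-end sketch (assuming a local Spark installation; the table name mtcars_tbl and view name cyl_counts are made up for illustration):

    ```r
    library(sparklyr)
    library(dplyr)

    sc <- spark_connect(master = "local[*]")

    sqlfunction <- function(sc, block) {
      spark_session(sc) %>% invoke("sql", block)
    }

    # copy a sample data frame into Spark so the query has a table to hit
    sdf_copy_to(sc, mtcars, name = "mtcars_tbl", overwrite = TRUE)

    # run SQL and register the result as a tbl_spark in one pipeline
    result <- sqlfunction(sc, "SELECT cyl, COUNT(*) AS n FROM mtcars_tbl GROUP BY cyl") %>%
      sdf_register("cyl_counts")

    result  # a tbl_spark you can keep piping into dplyr verbs

    spark_disconnect(sc)
    ```
    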

    The code you used:

    • spark_context - extracts the SparkContext instance, not the SQL entry point.
    • invoke("sqlContext.sql", block) - tries to call a method literally named sqlContext.sql, which does not exist on the SparkContext, hence the IllegalArgumentException.

    In recent sparklyr versions you can replace invoke("createOrReplaceTempView", ...) with a simple sdf_register.
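    As a side note (not part of the answer above, but consistent with the r-dbi tag): sparklyr also provides a DBI backend, so when you just want the query result pulled back into R as a data.frame, DBI::dbGetQuery can replace the wrapper entirely. A sketch, assuming a local connection and a made-up table name iris_tbl:

    ```r
    library(sparklyr)
    library(DBI)

    sc <- spark_connect(master = "local[*]")
    sdf_copy_to(sc, iris, name = "iris_tbl", overwrite = TRUE)

    # runs the SQL on Spark and collects the result as a local data.frame
    dbGetQuery(sc, "SELECT Species, COUNT(*) AS n FROM iris_tbl GROUP BY Species")

    spark_disconnect(sc)
    ```
    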