apache-spark, hive, amazon-emr, spark-thriftserver

How to register custom UDF jar in HiveThriftServer2?


In the HiveThriftServer2 class, what is the difference between calling startWithContext and calling main?

I have a custom UDF jar that I want to register, so that every time the Thrift server boots up, the functions are configured automatically. Is there a way to do this?

Can I use a HiveContext to register the UDF jar and functions, and then call HiveThriftServer2.startWithContext to start the server?
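
Something along these lines is what I have in mind (the jar path, the function name, and the UDF class are just placeholders for my own):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext
    import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

    object ThriftServerWithUdfs {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("thrift-with-udfs"))
        val hiveContext = new HiveContext(sc)

        // Placeholder jar path and UDF class; swap in the real ones
        hiveContext.sql("ADD JAR /opt/lib/custom-udfs.jar")
        hiveContext.sql("CREATE TEMPORARY FUNCTION my_udf AS 'com.example.udf.MyUdf'")

        // Start the Thrift server on top of the context that already has the UDF registered
        HiveThriftServer2.startWithContext(hiveContext)
      }
    }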

Thanks


Solution

  • What you are looking for is called hive.aux.jars.path, and it's a Hive property, not Spark-specific.

    I personally haven't tried it, but I'm thinking something like this:

    ./sbin/start-thriftserver.sh \
      --hiveconf hive.aux.jars.path=file:///opt/lib/custom-udfs.jar
    

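    With the jar on the aux path, my understanding is that it is on the server's classpath at startup, and you would still create the function once from a client, for example over JDBC. This is only a sketch; the host/port, function name, and UDF class are made up, so point them at whatever is bundled inside custom-udfs.jar:

    import java.sql.DriverManager

    object RegisterUdf {
      def main(args: Array[String]): Unit = {
        // The Hive JDBC driver has to be on the client classpath
        Class.forName("org.apache.hive.jdbc.HiveDriver")
        val conn = DriverManager.getConnection("jdbc:hive2://localhost:10000/default", "", "")
        val stmt = conn.createStatement()
        // Example names only; reference the UDF class shipped in custom-udfs.jar
        stmt.execute("CREATE FUNCTION my_udf AS 'com.example.udf.MyUdf'")
        stmt.close()
        conn.close()
      }
    }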