pentaho · kettle · pdi

Setting a variable from a shell script in Pentaho Kettle which can be accessed by further jobs


I want to know how I can set a variable from a Shell job in Pentaho Kettle so that it can be accessed by further jobs (Simple evaluation) in the workflow.

I am trying to create a workflow where a Start entry triggers a Shell job that checks whether a folder is present; if it is, the job sets a variable. The next job entry is a Simple evaluation that checks whether the variable (set by the Shell job) is true: if so, the workflow proceeds, otherwise it terminates.

Start-->ShellJob(check folder created and set variable)-->SimpleEvaluation Job.

--MIK


Solution

  • Good question. I'm not aware of such capability, as the "Execute a shell script..." step isn't designed to be a data pipeline. Furthermore, what values should/can a script return to you? Is it the result of an echo? A shell script could essentially be anything. I would say there's a reason why there is no built-in functionality for that in PDI.

    Having said that, what you could do is something like this:

    1. Execute the script and, at the end of it, write the variable(s) into a text file on the file system (a minimal sketch is shown after this list)
    2. Create a sub-transformation that reads the variable(s) from the file written by the shell script and stores them in variables with a scope visible to the parent/root job
    3. Evaluate the variables in the job
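
    For illustration, here is a minimal sketch of what the shell script in step 1 could look like. The folder path and the properties file name are assumptions made up for this example; in the sub-transformation you could then read that file with a Property Input (or Text file input) step and feed the value into a Set Variables step.

        #!/bin/sh
        # Hypothetical paths - adjust to your environment
        WATCH_DIR="/data/incoming"
        OUT_FILE="/tmp/folder_check.properties"

        # Write a key=value pair that the sub-transformation can read
        if [ -d "$WATCH_DIR" ]; then
            echo "FOLDER_PRESENT=true" > "$OUT_FILE"
        else
            echo "FOLDER_PRESENT=false" > "$OUT_FILE"
        fi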

    It may seem a bit cumbersome, but it should do the job for you, since you're asking to use the Shell Script step in a way it's not really designed to be used.

    Here's an example of a high-level implementation (implementation of the sub-transformation should be very simple):

    [Screenshot: example of the high-level job implementation]

    I hope it helps.