
Kettle ETL: passing a variable between transformations


In my transformation, I created a variable (the current time, formatted as yyyy-mm-dd HH24mmss) in a Modified Java Script step. I then use a Set Variables step to set the field to a variable, with the scope set to "Valid in the root job".
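A minimal sketch of what that Modified Java Script step body might look like, using only plain JavaScript (no Kettle helper functions assumed; the output field name `formatted_date` is hypothetical):

```javascript
// Hypothetical Modified Java Script step body: build the current time
// as a yyyy-mm-dd HHmmss string using only plain JavaScript.
function pad2(n) {
  return (n < 10 ? "0" : "") + n;
}

var now = new Date();
var formatted_date = now.getFullYear() + "-" +
  pad2(now.getMonth() + 1) + "-" +
  pad2(now.getDate()) + " " +
  pad2(now.getHours()) +
  pad2(now.getMinutes()) +
  pad2(now.getSeconds());
// formatted_date is then added as an output field of the step
// and fed into the Set Variables step.
```

Because the value is built as a string here, the downstream Set Variables step stores exactly this text, sidestepping any date-format ambiguity.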

The question is how to use that variable in another transformation (in the same job)? I tried the Get Variables step, but it seems to offer only system variables. What I want to do is output the date to a file in the second transformation. There are more transformations in between, which is why I can't do the output in the first transformation.

Or is it possible to create a variable in the job, set its value (the current date in yyyy-mm-dd HH24mmss), and then use it in the transformations?

EDIT:

The answer works, but the date is not in my expected format (yyyy-mm-dd HH24mmss), and it's not clear what format the date is in. E.g. if I try to format it in a Modified Java Script step and call getFullYear on it, I get TypeError: Cannot find function getFullYear in object Wed May 25 17:44:04 BST 2016 — that toString output is a java.util.Date, not a JavaScript Date, so the JavaScript Date methods don't exist on it. But if I just output it to a file, the date comes out as yyyy/mm/dd hh:mm:ss.
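The TypeError can be sketched outside Kettle: the object handed to the script has no getFullYear method because it is not a JavaScript Date. A hypothetical illustration, with a plain string standing in for the Java date object (neither has the method):

```javascript
// The value Kettle hands over stringifies like a java.util.Date.
// Here a plain string stands in for it; like the Java object,
// it has no JavaScript Date methods.
var raw = "Wed May 25 17:44:04 BST 2016";
var hasGetFullYear = (typeof raw.getFullYear === "function"); // false
// Calling raw.getFullYear() would therefore throw a TypeError,
// matching the error seen in the Modified Java Script step.
```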

So another way I found to do it is to use a Table Input step to generate a date in the desired format, then Set Variables; the rest is the same.
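The Table Input workaround might look like the following sketch, assuming an Oracle connection (the HH24 notation in the question suggests Oracle's TO_CHAR format model; other databases use different date-formatting functions):

```sql
-- Hypothetical Table Input query: return one row with the date
-- already formatted as a string, then feed it into Set Variables.
SELECT TO_CHAR(SYSDATE, 'YYYY-MM-DD HH24MISS') AS formatted_date
FROM dual
```

Since the step emits a string rather than a Date, the later transformation reads the variable back exactly as formatted, with no ambiguity about the date's internal representation.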


Solution

  • In your first transformation, use the Get System Info step to inject the current date/time into your data flow, then run it into a Set Variables step that sets the variable defined in your job.

    The variable you're using may not appear in the drop-down list when you press Ctrl-Space. This is because the variable is allocated by the job at run time and isn't available at design time. Just type ${VariableName} into the field at design time. When you run from a job that defines a variable of that name, it should work.