I currently have a Talend job that reads a context file and loads it into context variables. One of the fields is ftppassword, and I store the hard-coded password in the context file. The job has a matching context variable that I refer to wherever the password is needed. With this setup the job runs fine, but if I change the context file so that ftppassword contains the location of a password file instead of the hard-coded password, I get the following exception:
Exception in component tFTPConnection_1
com.enterprisedt.net.ftp.FTPException: 530 Login incorrect.
    at com.enterprisedt.net.ftp.FTPControlSocket.validateReply(FTPControlSocket.java:1179)
    at com.enterprisedt.net.ftp.FTPClient.password(FTPClient.java:1844)
    at com.enterprisedt.net.ftp.FTPClient.login(FTPClient.java:1766)
**Edit - 2014-12-08**

Output of the context parameters:

Implicit_Context_Context set key "ftphost" with value "ftp.host.com"
Implicit_Context_Context set key "ftpport" with value "21"
Implicit_Context_Context set key "ftpusername" with value "myuser"
Implicit_Context_Context set key "ftppassword" with value "/opt/password_files/DW/test1.password"
Implicit_Context_Context set key "ftpremotepath" with value "/Output/"
Implicit_Context_Context set key "ftpfilemask" with value "test_dn.zip"

I have also tried changing the data type of ftppassword to File and to Password, but had no luck with that.
The implicit tContextLoad option on the job is the equivalent of putting a tFileInputDelimited component at the start of your job with a schema of 2 columns: key and value. This is then read into a tContextLoad (hence the option name) to load the contexts in your job.
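For reference, the context file consumed this way is just a flat delimited file of key-value rows. A minimal sketch of such a file, assuming a semicolon as the field separator (use whatever separator your Implicit Context Load settings specify) and the variable names from the question:

ftphost;ftp.host.com
ftpport;21
ftpusername;myuser
ftppassword;p4ssw0rd
ftpremotepath;/Output/
ftpfilemask;test_dn.zip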
If your password file isn't in a key-value format then you can't use it this way.
The simplest option is to stick with the way you had it working before and use an implicit tContextLoad to load a delimited file with key-value pairs of your context variables.
Another option would be to stop using the implicit tContextLoad option and instead do the loading explicitly.
To do this you'd want to read in your password file using an appropriate connector such as a tFileInputDelimited. If you were reading in something that looked like /etc/passwd then you could split it on ":" to get columns such as username, password, uid and gid.
You could then use a tMap to populate an output schema of two columns: key and value. You would enter "ftppassword" as the key and connect the password field to the value column. You'll also want to filter this record set so that only one password row is emitted, so you might use something like "ftpUser".equals(row1.username) in the expression filter of your output table in the tMap.
Then just connect this to a tContextLoad component and your job should load the password from /etc/passwd for the "ftpUser" user account.
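Purely to make the data flow concrete, here is a minimal plain-Java sketch of what that tFileInputDelimited → tMap → tContextLoad chain amounts to: split a passwd-style file on ":", keep only the row whose username matches, and emit a single "ftppassword" key-value pair. The file path and the "ftpUser" account name are illustrative assumptions, not something generated by Talend.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.HashMap;
import java.util.Map;

public class PasswdToContext {
    public static void main(String[] args) throws IOException {
        // Illustrative inputs: a passwd-style file and the account we care about.
        String passwdFile = "/etc/passwd";
        String targetUser = "ftpUser";

        // Equivalent of tContextLoad: a key/value map of context variables.
        Map<String, String> context = new HashMap<>();

        for (String line : Files.readAllLines(Paths.get(passwdFile))) {
            // tFileInputDelimited equivalent: split each row on ":".
            String[] fields = line.split(":");
            if (fields.length < 2) {
                continue;
            }
            String username = fields[0];
            String password = fields[1];

            // tMap expression filter equivalent: "ftpUser".equals(row1.username)
            if (targetUser.equals(username)) {
                // Output schema of key/value, with "ftppassword" as the key.
                context.put("ftppassword", password);
            }
        }

        System.out.println("Loaded ftppassword: " + context.get("ftppassword"));
    }
}
```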
If you are instead looking to pass a file path to another file containing the password, so that you can split the dependencies and let one file hold all the other contexts for the job while the password file is kept elsewhere, then you'd pass a context variable pointing to the password file and explicitly consume it in the job.
In this case you might have a context file, loaded at run time, with contexts such as ftpremotepath, ftphost and ftpfilemask that can be set directly in the file, and an ftpusercredentials context variable that is a file path to a separate credentials file.
This file could then be another delimited file containing key-value pairs of context name and value such as:
ftpuser,myuser
ftppasswd,p4ssw0rd
Then at the start of your job you would explicitly read this in using a tFileInputDelimited component with a schema of 2 columns: key and value. You could then connect this to a tContextLoad component and this will load the second set of context variables into memory as well.
You could then use these as normal by referring to them as context.ftpuser and context.ftppasswd.
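As a rough illustration of that explicit load (the real job would of course use the components rather than hand-written code), here is a sketch in plain Java, assuming the comma-delimited credentials file shown above and a made-up path for it:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.HashMap;
import java.util.Map;

public class LoadCredentialsContext {
    public static void main(String[] args) throws IOException {
        // context.ftpusercredentials would point at this file; the path is illustrative.
        String credentialsFile = "/opt/password_files/DW/ftp.credentials";

        Map<String, String> context = new HashMap<>();

        for (String line : Files.readAllLines(Paths.get(credentialsFile))) {
            // tFileInputDelimited equivalent: two columns, key and value, comma separated.
            String[] parts = line.split(",", 2);
            if (parts.length == 2) {
                // tContextLoad equivalent: each row becomes a context variable.
                context.put(parts[0].trim(), parts[1].trim());
            }
        }

        // Referring to context.ftpuser in the job is then just a lookup on this map.
        System.out.println("ftpuser = " + context.get("ftpuser"));
    }
}
```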