I am performing an ETL job with Pentaho 7.1. The job populates a table 'PRO_T_TICKETS' in PostgreSQL 9.2 through Pentaho jobs and transformations.
I have mapped the table fields to the corresponding stream fields.
My table PRO_T_TICKETS has its column names in UPPERCASE. Is this the reason I can't populate PRO_T_TICKETS with my ETL job?
I duplicated the TABLE_OUTPUT step that writes to PRO_T_TICKETS and changed its Target table field to 'PRO_T_TICKETS2'. Pentaho created a new table with lowercase column names and populated the data in it.
However, I want the data loaded into the existing table PRO_T_TICKETS, keeping the UPPERCASE column names if possible.
I am attaching the whole job here along with the error thrown by Pentaho (see the attached Pentaho error screenshot). I have also tried adding double quotes around the column names in the query, as you can see in the error, but it didn't help.
What do you think I should do?
When you create (or modify) the connection, select Advanced in the left panel and check Force to upper case, Force to lower case, or, even better, Preserve case of reserved words.
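If you want to double-check which option matches your table, a query like the following (a suggestion of mine, assuming the table sits in the public schema as the error line indicates) shows how the column names are actually stored:

-- Identifier names are stored exactly as they were created;
-- the comparison on table_name is case-sensitive.
SELECT column_name
FROM information_schema.columns
WHERE table_schema = 'public'
  AND table_name = 'PRO_T_TICKETS';

If the names come back in UPPERCASE, the generated SQL must either quote them or force them to upper case; if they come back in lower case, the opposite applies.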
To know which option to choose, copy the 4th line of your error log, the line starting with INSERT INTO "public"."PRO_T_TICKETS"("OID"..., run it in your SQL client, and change the connection's advanced parameters until it works.
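The underlying reason, in case it helps while testing: PostgreSQL folds unquoted identifiers to lower case, so an UPPERCASE column can only be reached through a quoted identifier. A minimal illustration you can run in the same SQL client (only "OID" and the table name come from your error line, the rest is illustrative):

-- Unquoted identifiers are folded to lower case, so this looks for
-- a table named pro_t_tickets with a column oid and fails.
SELECT OID FROM PRO_T_TICKETS;

-- Quoted identifiers keep their exact case and match the UPPERCASE names.
SELECT "OID" FROM "public"."PRO_T_TICKETS";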
Also, at debug time, don't use batch updates, don't use lazy conversion on previous steps, and try with one (1) field rather than all (25).