Tags: sql, r, postgresql, csv, postgresql-copy

Importing and maintaining multiple csv files into PostgreSQL


I am new to using SQL, so please bear with me.

I need to import several hundred csv files into PostgreSQL. My web searches have only turned up how to import many csv files into a single table. However, most of the csv files have different column types (all have one-line headers). Is it possible to run a loop that imports each csv into a table with the same name as the file? Creating each table manually and specifying its columns is not an option. I know that COPY alone will not work, as the target table needs to already be specified.

Perhaps this is not feasible in PostgreSQL? I would like to accomplish this in pgAdmin III or the psql console, but I am open to other ideas (using something like R to change the csv files into a format more easily loaded into PostgreSQL?).

I am using PostgreSQL on a Windows 7 computer. It was requested that I use PostgreSQL, thus the focus of the question.

The desired result is a database full of tables that I will then join with a spreadsheet containing site-specific data. Thanks!


Solution

  • Use pgfutter.

    The general syntax looks like this:

   pgfutter csv <filename.csv>
    

    In order to run this on all csv files in a directory from Windows Command Prompt, navigate to the desired directory and enter:

       for %f in (*.csv) do pgfutter csv %f
    

    Note that the directory containing the downloaded executable must be added to the PATH environment variable.

    EDIT: Here is the equivalent command line code for Linux users.

    Run it as

   pgfutter csv *.csv
    

    Or, if that doesn't work:

   find . -iname '*.csv' -exec pgfutter csv {} \;
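
  • If pgfutter is not available, the same idea can be scripted by hand: read each file's header, create a table (named after the file) with all-TEXT columns, then load it with COPY — so the table exists before COPY runs, which was the original sticking point. Below is a minimal Python sketch of that approach (not part of the original answer; the function names `sanitize` and `import_statements` are my own). Column types can be tightened later with ALTER TABLE.

    ```python
    import csv
    import pathlib
    import re

    def sanitize(name):
        """Turn a CSV header or file name into a safe PostgreSQL
        identifier: collapse non-alphanumeric runs to '_', lower-case."""
        return re.sub(r"\W+", "_", name.strip()).strip("_").lower()

    def import_statements(csv_path):
        """Return (CREATE TABLE, \\copy) statements for one CSV file.
        Every column is TEXT so no type guessing is needed up front."""
        path = pathlib.Path(csv_path)
        table = sanitize(path.stem)          # table named after the file
        with open(path, newline="") as f:
            header = next(csv.reader(f))     # first line = column names
        cols = ", ".join(f"{sanitize(c)} TEXT" for c in header)
        create = f"CREATE TABLE {table} ({cols});"
        copy = f"\\copy {table} FROM '{path}' WITH (FORMAT csv, HEADER true)"
        return create, copy
    ```

    Feeding the generated statements to psql (one pair per file) imports the whole directory; the loop over `*.csv` works the same as in the pgfutter commands above.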