Tags: node.js, node-csv-parse, fast-csv

How to skip invalid rows while parsing csv file node js


I am using the Node.js library fast-csv to parse a CSV file and insert the rows into a database. The CSV contains rows where every field is zero, like below:

row - 0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0

How do I skip these rows while parsing? Below is the code I am using:

import { createReadStream } from "fs";
import * as pg from 'pg';
const { Pool } = pg.default;
import { parse } from "fast-csv";

const insertdata = async () => {

    let stream = createReadStream("./testfolder/test.csv");
    let csvData = [];
    let csvStream = parse()
        .on("data", function (data) {
            csvData.push(data)
            // console.log(csvData)
        })
        .on("end", function () {
            // remove the first line: header
            csvData.shift();
        });

    stream.pipe(csvStream);
}

await insertdata();

export { insertdata }

Solution

  • If you don't want rows where all the values are zero, then you want something with the following structure: if a row doesn't meet some condition, don't push it.

            .on("data", function (data) {
                if (!isAllZeros) {
                    csvData.push(data)
                }
            })
    

    And you can replace isAllZeros with this expression:

    Object.values(data).every(value => value === '0')
    

    This uses Object.values to get all the values as an array, and Array.prototype.every to check that every value is the string '0'.
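    To see how that expression behaves on its own, here is a small standalone sketch (the helper name `isAllZeros` is just for illustration). Note it works whether the parsed row is an array (fast-csv's default) or an object (when `headers: true` is set), because Object.values of an array returns its elements:

    ```javascript
    // Illustrative helper: true when every field of the row is the string '0'.
    // Handles both array rows (default) and object rows (headers: true).
    const isAllZeros = (row) => Object.values(row).every((value) => value === "0");

    console.log(isAllZeros(["0", "0", "0"]));    // true: every field is '0'
    console.log(isAllZeros(["0", "5", "0"]));    // false: one field differs
    console.log(isAllZeros({ a: "0", b: "0" })); // true for object rows too
    ```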


    So it'll look like this:

            .on("data", function (data) {
                if (!Object.values(data).every(value => value === '0')) {
                    csvData.push(data)
                }
            })