csv-parser is a streaming CSV parsing library for Node.js, designed for efficiency and correctness. It implements Node's native stream API, so you can pipe a file or any readable stream into the parser and process each row (as a JavaScript object or array) as soon as it is parsed, which is crucial for handling large CSV files without loading them entirely into memory. The parser handles standard CSV semantics, including quoted fields, configurable delimiters, escape sequences, and optional headers, making it robust across the CSV dialects you are likely to encounter.

Because it works incrementally, row by row, csv-parser is well suited to ETL pipelines, data ingestion workflows, CSV-to-database imports, or any other context where large tabular data must be processed or transformed efficiently in Node.js. Using the .on('data') and .on('end') events (or equivalent async patterns), you can accumulate, filter, transform, or stream data further downstream without waiting for the whole file.
Features
- Streaming API: parses CSV data row by row rather than loading the full file into memory
- Supports CSV conventions: quoted fields, delimiters, escapes, and optional headers
- Returns each row as a JS object or array for easy downstream processing
- Handles large CSV files efficiently, making it suitable for big-data ETL and import workflows
- Compatible with Node.js streams, enabling piping from file streams or other readable sources
- Simple API with .on('data') and .on('end') events, easy to integrate into existing Node.js applications