Parse CSV to records.
String, ReadableStream<string | Uint8Array>, Response, Request, Blob, and File are supported.
Header: the header type, e.g. ['name', 'age'].
csv: the CSV string to parse.
Returns: an async iterable iterator of records.
If you want an array of records instead, use the parse.toArray function (a sketch follows).
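A minimal sketch of parse.toArray, assuming the same small CSV string used in the examples below:

import { parse } from 'web-csv-toolbox';

const csv = `name,age
Alice,42
Bob,69`;

// Collect every record into an array in a single await.
const records = await parse.toArray(csv);
console.log(records);
// Prints: [ { name: 'Alice', age: '42' }, { name: 'Bob', age: '69' } ]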
parseString, parseBinary, parseBinaryStream, parseStringStream, parseResponse, parseRequest, and parseBlob are used internally.
If you know the type of the CSV input, using them directly performs better, as in the sketch below.
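For instance, when the input is already known to be a string, parseString can be called directly; a minimal sketch:

import { parseString } from 'web-csv-toolbox';

const csv = `name,age
Alice,42
Bob,69`;

// Calling the string-specific parser skips input-type detection.
for await (const record of parseString(csv)) {
  console.log(record);
}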
Performance Characteristics:
This function processes CSV data as an async iterable iterator, yielding one record at a time. Memory footprint remains constant regardless of file size, making it ideal for large datasets.
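Because records are yielded one at a time, a consumer can also stop early; a minimal sketch, with a hypothetical URL and an arbitrary cutoff:

import { parse } from 'web-csv-toolbox';

// Hypothetical endpoint serving a large CSV file.
const response = await fetch('https://example.com/large.csv');

let count = 0;
for await (const record of parse(response)) {
  count += 1;
  if (count >= 100) break; // stop early; the remaining input need not be consumed
}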
Example: parsing a CSV string.

import { parse } from 'web-csv-toolbox';

const csv = `name,age
Alice,42
Bob,69`;

for await (const record of parse(csv)) {
  console.log(record);
}
// Prints:
// { name: 'Alice', age: '42' }
// { name: 'Bob', age: '69' }
Example: parsing a CSV string stream.

import { parse } from 'web-csv-toolbox';

const csv = `name,age
Alice,42
Bob,69`;

const stream = new ReadableStream({
  start(controller) {
    controller.enqueue(csv);
    controller.close();
  },
});

for await (const record of parse(stream)) {
  console.log(record);
}
// Prints:
// { name: 'Alice', age: '42' }
// { name: 'Bob', age: '69' }
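Blob and File inputs work the same way; a minimal sketch, assuming a File picked through a hypothetical <input type="file"> element:

import { parse } from 'web-csv-toolbox';

// Hypothetical file picker; any Blob or File can be passed to parse.
const input = document.querySelector('input[type="file"]');
const file = input.files[0];

for await (const record of parse(file)) {
  console.log(record);
}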
Parse CSV binary to records.
Example: parsing a fetched Response (UTF-8, not compressed).

import { parse } from 'web-csv-toolbox';

// This CSV data is not gzipped and is encoded in UTF-8.
const response = await fetch('https://example.com/data.csv');

for await (const record of parse(response)) {
  // ...
}
Example: parsing a fetched Response (gzipped, Shift_JIS, with BOM).

import { parse } from 'web-csv-toolbox';

// This CSV data is gzipped, encoded in Shift_JIS, and has a BOM.
const response = await fetch('https://example.com/data.csv.gz');

for await (const record of parse(response, {
  charset: 'shift-jis',
  ignoreBOM: true,
  decompression: 'gzip',
})) {
  // ...
}