Note: For WASM-enabled features in Node, Node.js 20.6+ is recommended (the WASM loader uses
import.meta.resolve). On older Node versions, pass an explicit URL/Buffer to loadWASM().
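A minimal sketch of the explicit-load fallback the note describes, assuming loadWASM() accepts a URL as stated above; the .wasm path is illustrative and depends on how the package is installed:

```ts
import { loadWASM } from 'web-csv-toolbox';

// Hypothetical path: adjust it to wherever the package's .wasm binary
// actually resolves in your install.
await loadWASM(
  new URL('./node_modules/web-csv-toolbox/dist/web_csv_toolbox.wasm', import.meta.url)
);
```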
Convert a Node.js fs.ReadStream to a Web Streams API ReadableStream before parsing.
```ts
import { parseBinaryStream } from 'web-csv-toolbox';
import { createReadStream } from 'node:fs';
import { Readable } from 'node:stream';

// Create a Node.js read stream
const nodeStream = createReadStream('data.csv');

// Convert it to a Web ReadableStream
const webStream = Readable.toWeb(nodeStream) as ReadableStream<Uint8Array>;

// Parse from the web stream
let count = 0;
for await (const record of parseBinaryStream(webStream)) {
  // Process the record (e.g., save to a database, validate, etc.)
  console.log(record);
  count++;
}

console.log(`Parsed ${count} records`);
```
With charset and options:
```ts
import { parseBinaryStream } from 'web-csv-toolbox';
import { createReadStream } from 'node:fs';
import { Readable } from 'node:stream';

const nodeStream = createReadStream('large-file.csv', {
  highWaterMark: 64 * 1024 // 64KB chunks
});

const webStream = Readable.toWeb(nodeStream) as ReadableStream<Uint8Array>;

// Parse with options, processing one record at a time (memory efficient)
for await (const record of parseBinaryStream(webStream, {
  charset: 'utf-8',
  delimiter: ','
})) {
  // Process each record immediately
  await processRecord(record);
}

async function processRecord(record: any) {
  // Database insert, API call, etc.
  console.log(record);
}
```
Compressed file streams:
```ts
import { parseBinaryStream } from 'web-csv-toolbox';
import { createReadStream } from 'node:fs';
import { Readable } from 'node:stream';

const nodeStream = createReadStream('data.csv.gz');
const webStream = Readable.toWeb(nodeStream) as ReadableStream<Uint8Array>;

// Parse the compressed stream, decompressing on the fly
for await (const record of parseBinaryStream(webStream, {
  decompression: 'gzip'
})) {
  console.log(record);
}
```