web-csv-toolbox - v0.14.0

    Class StringCSVParserStream<Header, Format>

    A transform stream that converts a stream of strings into a stream of CSV records. Wraps a StringCSVParser instance to provide streaming CSV parsing.

Constructor parameters (in order):

• StringCSVParser instance to use for parsing.
• Stream-specific options (backpressureCheckInterval, etc.).
• Queuing strategy for the writable side (default: { highWaterMark: 65536, size: chunk => chunk.length }).
• Queuing strategy for the readable side (default: { highWaterMark: 256 }).

This stream follows the Web Streams API pattern of passing queuing strategies as constructor arguments, similar to CSVLexerTransformer and CSVRecordAssemblerTransformer.

Default Queuing Strategies (see the sketch after this list for their explicit equivalents):

• Writable side: counts by string length (characters). Default highWaterMark is 65536 characters (≈64KB).
• Readable side: counts each record as 1. Default highWaterMark is 256 records.
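
For reference, a minimal sketch of what these defaults look like when written out explicitly. Passing undefined for the stream-specific options argument is an assumption; omitting both strategy arguments yields the documented defaults.

import { FlexibleStringObjectCSVParser } from './models/FlexibleStringObjectCSVParser.js';
import { StringCSVParserStream } from './stream/StringCSVParserStream.js';

const parser = new FlexibleStringObjectCSVParser({ header: ['name', 'age'] });

const stream = new StringCSVParserStream(
  parser,
  undefined,                                                // stream-specific options (none)
  { highWaterMark: 65536, size: (chunk) => chunk.length },  // writable side: counts characters
  new CountQueuingStrategy({ highWaterMark: 256 })          // readable side: counts records
);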

    Backpressure Handling: The transformer monitors controller.desiredSize and yields to the event loop when backpressure is detected (desiredSize ≤ 0). This prevents blocking the main thread during heavy processing and allows the downstream consumer to catch up.
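
The check itself is internal to the transformer, but as an illustrative sketch (not this library's actual implementation), a transform callback that yields under backpressure could look like this:

// Illustrative only: enqueue records and yield to the event loop whenever the
// readable side reports backpressure (desiredSize <= 0). The parser argument is
// assumed to have the parse(chunk, { stream: true }) shape documented below; the
// real implementation presumably checks only every backpressureCheckInterval records.
async function transformWithBackpressure(
  parser: { parse(chunk?: string, options?: { stream?: boolean }): IterableIterator<unknown> },
  chunk: string,
  controller: TransformStreamDefaultController<unknown>,
): Promise<void> {
  for (const record of parser.parse(chunk, { stream: true })) {
    controller.enqueue(record);
    if (controller.desiredSize !== null && controller.desiredSize <= 0) {
      // Give the downstream consumer a chance to drain the readable queue.
      await new Promise((resolve) => setTimeout(resolve, 0));
    }
  }
}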

import { FlexibleStringObjectCSVParser } from './models/FlexibleStringObjectCSVParser.js';
import { StringCSVParserStream } from './stream/StringCSVParserStream.js';

const parser = new FlexibleStringObjectCSVParser({ header: ['name', 'age'] });
const stream = new StringCSVParserStream(parser);

new ReadableStream({
  start(controller) {
    controller.enqueue("Alice,30\r\n");
    controller.enqueue("Bob,25\r\n");
    controller.close();
  }
})
  .pipeThrough(stream)
  .pipeTo(new WritableStream({
    write(record) {
      console.log(record);
    }
  }));
// { name: 'Alice', age: '30' }
// { name: 'Bob', age: '25' }

With custom options and queuing strategies:

const parser = new FlexibleStringObjectCSVParser({ header: ['name', 'age'] });
const stream = new StringCSVParserStream(
  parser,
  { backpressureCheckInterval: 50 },
  { highWaterMark: 131072, size: (chunk) => chunk.length },
  new CountQueuingStrategy({ highWaterMark: 512 })
);

const res = await fetch('large-file.csv');
await res.body
  .pipeThrough(new TextDecoderStream())
  .pipeThrough(stream)
  .pipeTo(yourProcessor);

    Type Parameters

    • Header extends ReadonlyArray<string> = readonly string[]

      The type of the header row

    • Format extends "object" | "array" = "object"

      Output format: 'object' or 'array'

    Properties

    parser: {
        parse(
            chunk?: string,
            options?: { stream?: boolean },
        ): IterableIterator<CSVRecord<Header, Format>>;
    }
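
The wrapped parser is exposed here, so records can also be pulled synchronously without going through the stream. A minimal sketch, reusing the parser setup from the examples above and assuming that a call without the stream option treats the chunk as complete input:

const stream = new StringCSVParserStream(
  new FlexibleStringObjectCSVParser({ header: ['name', 'age'] })
);

// Iterate records produced directly by the underlying parser.
for (const record of stream.parser.parse("Alice,30\r\nBob,25\r\n")) {
  console.log(record); // { name: 'Alice', age: '30' }, then { name: 'Bob', age: '25' }
}
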
    readable: ReadableStream<CSVRecord<Header, Format, "keep">>

    The readable read-only property of the TransformStream interface returns the ReadableStream instance controlled by this TransformStream.

    writable: WritableStream<string>

    The writable read-only property of the TransformStream interface returns the WritableStream instance controlled by this TransformStream.
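
Where pipeThrough is not convenient, both sides can also be driven by hand with the standard getWriter/getReader APIs. A minimal sketch, assuming the same parser setup as in the examples above:

const stream = new StringCSVParserStream(
  new FlexibleStringObjectCSVParser({ header: ['name', 'age'] })
);

const writer = stream.writable.getWriter();
const reader = stream.readable.getReader();

// Push string chunks into the writable side, then close it.
await writer.write("Alice,30\r\n");
await writer.write("Bob,25\r\n");
await writer.close();

// Pull assembled records from the readable side until the stream is done.
for (let result = await reader.read(); !result.done; result = await reader.read()) {
  console.log(result.value);
}
// { name: 'Alice', age: '30' }
// { name: 'Bob', age: '25' }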
