web-csv-toolbox - v0.14.0

    Class BinaryCSVParserStream<Header, Format>

    A transform stream that converts a stream of binary data (BufferSource) into a stream of CSV records. Wraps a BinaryCSVParser instance to provide streaming CSV parsing.

    Constructor parameters:

    • BinaryCSVParser instance to use for parsing
    • Stream-specific options (backpressureCheckInterval, etc.)
    • Strategy for the writable side (default: ByteLengthQueuingStrategy({ highWaterMark: 65536 }))
    • Strategy for the readable side (default: CountQueuingStrategy({ highWaterMark: 256 }))

    Follows the Web Streams API pattern where queuing strategies are passed as constructor arguments, similar to CSVLexerTransformer and CSVRecordAssemblerTransformer.

    Accepts any BufferSource type (Uint8Array, ArrayBuffer, or other TypedArray views) as input chunks.
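    To illustrate what "any BufferSource" covers, here is a small library-independent sketch: a Uint8Array, a bare ArrayBuffer, and a DataView all carry the same bytes and all satisfy the writable side's chunk type.

```typescript
// Equivalent BufferSource chunks (library-independent sketch):
// a Uint8Array, the underlying ArrayBuffer, and a DataView over it.
const bytes = new TextEncoder().encode("Alice,30\r\n"); // Uint8Array, 10 bytes
const buffer = bytes.buffer.slice(
  bytes.byteOffset,
  bytes.byteOffset + bytes.byteLength,
); // bare ArrayBuffer with the same 10 bytes
const view = new DataView(buffer); // another ArrayBufferView
```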

    Default Queuing Strategy:

    • Writable side: ByteLengthQueuingStrategy with highWaterMark of 65536 bytes (64KB).
    • Readable side: CountQueuingStrategy with highWaterMark of 256 records.
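    The defaults above can be spelled out with the standard Web Streams strategy classes. A sketch, assuming the constructor takes the strategies in the order listed in the parameter descriptions above:

```typescript
// Reproducing the documented defaults explicitly; tune highWaterMark
// to your workload (bytes on the writable side, records on the readable).
const writableStrategy = new ByteLengthQueuingStrategy({ highWaterMark: 65536 }); // 64KB
const readableStrategy = new CountQueuingStrategy({ highWaterMark: 256 }); // 256 records

// Assumed constructor shape, per the parameter descriptions above:
// const stream = new BinaryCSVParserStream(parser, {}, writableStrategy, readableStrategy);
```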

    Backpressure Handling: The transformer monitors controller.desiredSize and yields to the event loop when backpressure is detected (desiredSize ≤ 0). This prevents blocking the main thread during heavy processing and allows the downstream consumer to catch up.
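    The pattern described above can be sketched with a plain TransformStream (this is not the library's internal code): before enqueueing, check the readable side's desiredSize and yield to the event loop when the queue is full, so the downstream consumer can drain it.

```typescript
// Library-independent sketch of the documented backpressure pattern:
// when desiredSize <= 0 the readable queue is full, so await a
// macrotask before enqueueing to let the consumer catch up.
const yieldingTransform = new TransformStream<number, number>(
  {
    async transform(chunk, controller) {
      if (controller.desiredSize !== null && controller.desiredSize <= 0) {
        await new Promise((resolve) => setTimeout(resolve, 0)); // yield to event loop
      }
      controller.enqueue(chunk * 2);
    },
  },
  new CountQueuingStrategy({ highWaterMark: 16 }),
  new CountQueuingStrategy({ highWaterMark: 4 }),
);
```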

    import { FlexibleBinaryObjectCSVParser } from './models/FlexibleBinaryObjectCSVParser.js';
    import { BinaryCSVParserStream } from './stream/BinaryCSVParserStream.js';

    const parser = new FlexibleBinaryObjectCSVParser({ header: ['name', 'age'], charset: 'utf-8' });
    const stream = new BinaryCSVParserStream(parser);

    const encoder = new TextEncoder();
    new ReadableStream({
      start(controller) {
        controller.enqueue(encoder.encode("Alice,30\r\n"));
        controller.enqueue(encoder.encode("Bob,25\r\n"));
        controller.close();
      }
    })
      .pipeThrough(stream)
      .pipeTo(new WritableStream({
        write(record) {
          console.log(record);
        }
      }));
    // { name: 'Alice', age: '30' }
    // { name: 'Bob', age: '25' }
    const parser = new FlexibleBinaryObjectCSVParser({ header: ['name', 'age'] });
    const stream = new BinaryCSVParserStream(parser);

    const response = await fetch('large-file.csv');
    await response.body
      .pipeThrough(stream)
      .pipeTo(yourProcessor);

    Type Parameters

    • Header extends ReadonlyArray<string> = readonly string[]

      The type of the header row

    • Format extends "object" | "array" = "object"

      Output format: 'object' or 'array'


    Constructors

    Properties

    parser: {
        parse(
            chunk?: BufferSource,
            options?: { stream?: boolean },
        ): IterableIterator<CSVRecord<Header, Format>>;
    }
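    The parser shape above implies a two-phase protocol: feed chunks with { stream: true } while data is arriving, then call parse() with no chunk to flush what remains. A hypothetical stand-in satisfying this shape (yielding plain lines rather than CSVRecord objects, purely for illustration):

```typescript
// Hypothetical parser matching the property's duck type: buffers
// partial input across streaming calls, flushes on the final call.
const decoder = new TextDecoder();
let pending = "";
const lineParser = {
  *parse(chunk?: BufferSource, options?: { stream?: boolean }): IterableIterator<string> {
    if (chunk !== undefined) {
      pending += decoder.decode(chunk, { stream: options?.stream ?? false });
    } else {
      pending += decoder.decode(); // flush any partial multi-byte sequence
    }
    const lines = pending.split("\n");
    // In streaming mode, hold back the last (possibly incomplete) line.
    pending = options?.stream ? (lines.pop() ?? "") : "";
    yield* lines.filter((line) => line.length > 0);
  },
};
```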
    readable: ReadableStream<CSVRecord<Header, Format, "keep">>

    The readable read-only property of the TransformStream interface returns the ReadableStream instance controlled by this TransformStream.

    MDN Reference

    writable: WritableStream<BufferSource>

    The writable read-only property of the TransformStream interface returns the WritableStream instance controlled by this TransformStream.

    MDN Reference

    Methods