web-csv-toolbox - v0.14.0

    Interface BinaryCSVProcessingOptions<Header, Delimiter, Quotation>

    CSV processing specification options for binary data.

    Extends CSVProcessingOptions with binary-specific options such as charset, decompression, and buffer size limits. Execution-strategy (engine) options are excluded.

    Used by low-level Binary Parser classes. High-level APIs use ParseBinaryOptions which adds EngineOptions.

    const parser = new FlexibleBinaryObjectCSVParser({
      delimiter: ',',
      header: ['name', 'age'],
      charset: 'utf-8',
      decompression: 'gzip',
      // engine is NOT available here
    });
    interface BinaryCSVProcessingOptions<
        Header extends ReadonlyArray<string> = ReadonlyArray<string>,
        Delimiter extends string = DEFAULT_DELIMITER,
        Quotation extends string = DEFAULT_QUOTATION,
    > {
        allowExperimentalCompressions?: boolean;
        allowNonStandardCharsets?: boolean;
        charset?: string;
        columnCountStrategy?: ColumnCountStrategy;
        decompression?: CompressionFormat;
        delimiter?: Delimiter;
        fatal?: boolean;
        header?: Header;
        ignoreBOM?: boolean;
        includeHeader?: boolean;
        maxBinarySize?: number;
        maxBufferSize?: number;
        maxFieldCount?: number;
        outputFormat?: "object" | "array";
        quotation?: Quotation;
        signal?: AbortSignal;
        skipEmptyLines?: boolean;
        source?: string;
    }


    Properties

    allowExperimentalCompressions?: boolean

    Allow experimental or non-standard compression formats not explicitly supported by this library.

    When true, compression formats from Content-Encoding headers that are not in the default supported list will be passed to the runtime's DecompressionStream without validation. This allows using compression formats that may not be universally supported across all browsers.

    When false (default), only universally supported formats are allowed:

    • Node.js: gzip, deflate, br (Brotli)
    • Browsers: gzip, deflate

    Some compression formats are only supported in specific environments:

    • deflate-raw: Supported in Chromium-based browsers (Chrome, Edge) but may not work in Firefox or Safari
    • br (Brotli): Browser support varies and may change over time
    • Other formats: Depends on runtime implementation

    If you enable this option and use deflate-raw:

    • ✅ Works in Chrome, Edge (Chromium-based)
    • ❌ May fail in Firefox, Safari
    • Consider implementing fallback logic or detecting browser support at runtime

    Use with caution: Enabling this bypasses library validation and relies entirely on runtime error handling. If the runtime doesn't support the format, you'll get a runtime error instead of a clear validation error from this library.

    Default: false
    
    // Safe mode (default): Only universally supported formats
    const gzipResponse = await fetch('data.csv.gz');
    await parse(gzipResponse); // ✓ Works in all browsers

    // Experimental mode: Allow deflate-raw (Chromium-only)
    const rawResponse = await fetch('data.csv'); // Content-Encoding: deflate-raw
    await parse(rawResponse, { allowExperimentalCompressions: true });
    // ✓ Works in Chrome/Edge
    // ✗ May fail in Firefox/Safari

    // Browser-aware usage
    const isChromium = navigator.userAgent.includes('Chrome');
    await parse(await fetch('data.csv'), {
      allowExperimentalCompressions: isChromium
    });
    allowNonStandardCharsets?: boolean

    Allow non-standard character encodings not in the common charset list.

    When true, charset values from Content-Type headers that are not in the default supported list will be passed to the runtime's TextDecoder without validation. This allows using character encodings that may not be universally supported across all environments.

    When false (default), only commonly used charsets are allowed, including:

    • UTF: utf-8, utf-16le, utf-16be
    • ISO-8859: iso-8859-1 through iso-8859-16
    • Windows: windows-1250 through windows-1258
    • Asian: shift_jis, euc-jp, gb18030, euc-kr, etc.

    Use with caution: Enabling this bypasses library validation and relies entirely on runtime error handling. Invalid or malicious charset values could cause:

    • Runtime exceptions from TextDecoder
    • Unexpected character decoding behavior
    • Potential security vulnerabilities

    It's recommended to validate charset values against your expected inputs before enabling this option.

    Default: false
    
    // Safe mode (default): Only commonly supported charsets
    const response = await fetch('data.csv');
    await parse(response); // charset must be in SUPPORTED_CHARSETS

    // Allow a non-standard charset
    // Content-Type: text/csv; charset=custom-encoding
    const customResponse = await fetch('data.csv');
    await parse(customResponse, { allowNonStandardCharsets: true });
    // ⚠️ May throw an error if the runtime doesn't support the charset
    charset?: string

    You can specify the character encoding of the binary.

    TextDecoderStream is used internally.

    See Encoding API Compatibility for the encoding formats that can be specified.

    Default: 'utf-8'
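The charset label is forwarded to the runtime's decoder; the effect of choosing a non-default label can be seen with a plain TextDecoder (a standalone sketch, not library code):

```typescript
// "AB" encoded as UTF-16LE bytes; decoding with the matching
// charset label recovers the original text.
const utf16leBytes = new Uint8Array([0x41, 0x00, 0x42, 0x00]);
const decoded = new TextDecoder("utf-16le").decode(utf16leBytes);
console.log(decoded); // "AB"
```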
    
    columnCountStrategy?: ColumnCountStrategy
    decompression?: CompressionFormat

    If the binary is compressed by a compression algorithm, the decompressed CSV can be parsed by specifying the algorithm.

    Make sure the runtime you are running supports stream decompression.

    See DecompressionStream Compatibility.
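What this option wires up is a DecompressionStream stage in the input pipeline. A standalone round trip through the Web Streams compression API (assuming a runtime that exposes CompressionStream and DecompressionStream as globals, e.g. Node.js 18+) shows the mechanism:

```typescript
// Compress CSV text with gzip, then decompress it again --
// the same transform the `decompression` option applies to input.
const input = new Blob(["name,age\nAlice,30\n"]);
const compressed = input.stream().pipeThrough(new CompressionStream("gzip"));
const decompressed = compressed.pipeThrough(new DecompressionStream("gzip"));
const text = await new Response(decompressed).text();
console.log(text); // "name,age\nAlice,30\n"
```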

    delimiter?: Delimiter

    CSV field delimiter. If you want to parse TSV, specify '\t'.

    The restrictions are as follows:

    • Must not be empty
    • Must be a single character
      • Multi-byte characters are not supported
    • Must not include CR or LF
    • Must not be the same as the quotation
    Default: ','
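The restrictions above could be captured in a small validator. This is a hypothetical sketch for illustration, not the library's actual validation code:

```typescript
// Validate a delimiter against the documented restrictions.
// Note: the single-character check counts UTF-16 code units, so
// multi-byte (astral) characters are rejected as well.
function validateDelimiter(delimiter: string, quotation = '"'): void {
  if (delimiter.length === 0) {
    throw new Error("delimiter must not be empty");
  }
  if (delimiter.length !== 1) {
    throw new Error("delimiter must be a single character");
  }
  if (delimiter === "\r" || delimiter === "\n") {
    throw new Error("delimiter must not include CR or LF");
  }
  if (delimiter === quotation) {
    throw new Error("delimiter must not be the same as the quotation");
  }
}

validateDelimiter("\t"); // OK: TSV
validateDelimiter(",");  // OK: default CSV
```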
    
    fatal?: boolean

    If the binary contains an invalid character, you can specify whether to throw an error.

    If true, the decoder throws a TypeError when it encounters malformed data while decoding.

    If false, the decoder substitutes the invalid data with the replacement character U+FFFD (�).

    See TextDecoderOptions.fatal for more information.

    Default: false
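The two behaviors can be demonstrated with a plain TextDecoder, which this option is forwarded to:

```typescript
// 0xFF is not a valid UTF-8 byte. The lenient decoder substitutes
// U+FFFD; with fatal: true the decoder throws a TypeError instead.
const bad = new Uint8Array([0x41, 0xff]); // "A" followed by invalid byte
const lenient = new TextDecoder("utf-8").decode(bad); // "A\uFFFD"

let threw = false;
try {
  new TextDecoder("utf-8", { fatal: true }).decode(bad);
} catch (e) {
  threw = e instanceof TypeError; // true
}
```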
    
    header?: Header
    ignoreBOM?: boolean

    If the binary has a BOM, you can specify how it is handled.

    If you specify false or do not specify it (the default), a leading BOM is stripped from the decoded output. If you specify true, the BOM is treated as a normal character and kept in the output as U+FEFF. See TextDecoderOptions.ignoreBOM for more information about the BOM.

    Default: false
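The difference is easiest to see with a plain TextDecoder, which this option is forwarded to:

```typescript
// UTF-8 BOM (EF BB BF) followed by "A".
const withBom = new Uint8Array([0xef, 0xbb, 0xbf, 0x41]);

// Default: the leading BOM is stripped from the output.
const stripped = new TextDecoder("utf-8").decode(withBom); // "A"

// ignoreBOM: true keeps the BOM as a normal U+FEFF character.
const kept = new TextDecoder("utf-8", { ignoreBOM: true }).decode(withBom); // "\uFEFFA"
```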
    
    includeHeader?: boolean
    maxBinarySize?: number

    Maximum binary size in bytes for ArrayBuffer/Uint8Array inputs.

    This option limits the size of ArrayBuffer or Uint8Array inputs to prevent memory exhaustion attacks. When the binary size exceeds this limit, a RangeError will be thrown.

    Set to Number.POSITIVE_INFINITY to disable the limit (not recommended for untrusted input).

    Default: 100 * 1024 * 1024 (100 MB)
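A minimal sketch of the kind of guard this option implies (hypothetical, not the library's actual implementation):

```typescript
// Reject ArrayBuffer/Uint8Array inputs larger than maxBinarySize.
function assertBinarySize(
  input: ArrayBuffer | Uint8Array,
  maxBinarySize = 100 * 1024 * 1024,
): void {
  const size = input.byteLength;
  if (size > maxBinarySize) {
    throw new RangeError(
      `Binary size ${size} bytes exceeds maxBinarySize ${maxBinarySize}`,
    );
  }
}

assertBinarySize(new Uint8Array(10)); // OK: well under the default limit
```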
    
    maxBufferSize?: number

    Maximum internal buffer size in characters.

    This option limits the size of the internal buffer used during lexing to prevent memory exhaustion attacks. The buffer size is measured in UTF-16 code units (JavaScript string length). When the buffer exceeds this limit, a RangeError will be thrown.

    Set to Infinity to disable the limit (not recommended for untrusted input).

    Default: 10 * 1024 * 1024 (approximately 10 MB for ASCII, but may vary for non-ASCII)
    
    maxFieldCount?: number
    outputFormat?: "object" | "array"
    quotation?: Quotation

    CSV field quotation.

    Default: '"'
    
    signal?: AbortSignal

    The signal to abort the operation.

    If the signal is aborted, the operation will be stopped.

    // Abort with an AbortController
    const controller = new AbortController();

    const csv = "foo,bar\n1,2\n3,4";
    try {
      const result = await parse(csv, { signal: controller.signal });
    } catch (e) {
      if (e instanceof DOMException && e.name === "AbortError") {
        console.log("Aborted");
      }
    }

    // Abort on user action
    document.getElementById("cancel-button")
      .addEventListener("click", () => {
        controller.abort();
      });

    // Abort with a timeout
    try {
      const result = await parse(csv, { signal: AbortSignal.timeout(1000) });
    } catch (e) {
      if (e instanceof DOMException && e.name === "TimeoutError") {
        console.log("Timeout");
      }
    }
    Default: undefined
    
    skipEmptyLines?: boolean
    source?: string

    Source identifier for error reporting (e.g., filename, description).

    This option allows you to specify a human-readable identifier for the CSV source that will be included in error messages. This is particularly useful when parsing multiple files or streams to help identify which source caused an error.

    Security Note: Do not include sensitive information (API keys, tokens, full URLs) in this field as it may be exposed in error messages and logs.

    parseString(csv, { source: "users.csv" });
    // Error: Field count exceeded at row 5 in "users.csv"
    Default: undefined