web-csv-toolbox

    Function parseStringStream

    • Parse a CSV string stream into records.

      Type Parameters

      • const CSVSource extends ReadableStream<string>
      • const Delimiter extends string = ","
      • const Quotation extends string = "\""
      • const Header extends readonly string[] = PickCSVHeader<CSVSource, Delimiter, Quotation>

      Parameters

      Returns AsyncIterableIterator<CSVRecord<Header>>

      Async iterable iterator of records.

      If you want an array of records, use the parseStringStream.toArray function.

      Stream Execution Strategies:

      For streams, the engine configuration supports two worker strategies:

      • stream-transfer (recommended): Zero-copy stream transfer to worker
        • Supported on Chrome, Firefox, Edge
        • Automatically falls back to message-streaming on Safari
      • message-streaming: Records sent via postMessage
        • Works on all browsers including Safari
        • Slightly higher overhead but more compatible

      By default, streams use main thread execution. To use workers with streams:

      import { parseStringStream, EnginePresets } from 'web-csv-toolbox';

      // Use worker with automatic stream-transfer (falls back if not supported)
      for await (const record of parseStringStream(stream, {
        engine: EnginePresets.workerStreamTransfer()
      })) {
        console.log(record);
      }

      Note: WASM execution is not supported for streams. If you specify engine: { wasm: true } with a stream, parsing falls back to the main thread.

      import { parseStringStream } from 'web-csv-toolbox';

      const csv = `name,age
      Alice,42
      Bob,69`;

      const stream = new ReadableStream({
        start(controller) {
          controller.enqueue(csv);
          controller.close();
        },
      });

      for await (const record of parseStringStream(stream)) {
        console.log(record);
      }
      // Prints:
      // { name: 'Alice', age: '42' }
      // { name: 'Bob', age: '69' }

      import { parseStringStream } from 'web-csv-toolbox';

      const response = await fetch('large-file.csv');
      const stream = response.body
        .pipeThrough(new TextDecoderStream());

      // Use worker with stream-transfer strategy
      for await (const record of parseStringStream(stream, {
        engine: { worker: true, workerStrategy: 'stream-transfer' }
      })) {
        console.log(record);
      }
    • Parse a CSV string stream into records.

      Type Parameters

      • const Header extends readonly string[]

      Parameters

      Returns AsyncIterableIterator<CSVRecord<Header>>

      Async iterable iterator of records.

      If you want an array of records, use the parseStringStream.toArray function.
