web-csv-toolbox

    Function parse

    • Parse CSV to records.

      String, ReadableStream<string | Uint8Array> and Response are supported.

      Type Parameters

      Parameters

      Returns AsyncIterableIterator<CSVRecord<PickCSVHeader<CSVSource>>>

      Async iterable iterator of records.

      If you want an array of records, use the parse.toArray function.
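      For instance, to collect every record at once (a minimal sketch; parse.toArray resolves once all input has been consumed, so the output mirrors the streaming examples below):

      import { parse } from 'web-csv-toolbox';

      const csv = `name,age
      Alice,42
      Bob,69`;

      // parse.toArray buffers all records and resolves to an array.
      const records = await parse.toArray(csv);
      console.log(records);
      // [ { name: 'Alice', age: '42' }, { name: 'Bob', age: '69' } ]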

      parseString, parseBinary, parseUint8ArrayStream, parseStringStream and parseResponse are used internally.

      If you know the type of the CSV source, using one of these functions directly performs better.

      If you want to parse a...    Use...                   Options...
      String                       parseString              ParseOptions
      ReadableStream<string>       parseStringStream        ParseOptions
      Uint8Array | ArrayBuffer     parseBinary              ParseBinaryOptions
      ReadableStream<Uint8Array>   parseUint8ArrayStream    ParseBinaryOptions
      Response                     parseResponse            ParseBinaryOptions
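      For example, if the source is known to be a string, parseString can be called directly and the runtime source-type dispatch is skipped (a minimal sketch based on the table above):

      import { parseString } from 'web-csv-toolbox';

      const csv = `name,age
      Alice,42
      Bob,69`;

      // parseString accepts only strings, so no source detection is needed.
      for await (const record of parseString(csv)) {
        console.log(record);
      }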

      Performance Characteristics:

      • Memory usage: O(1) - constant per record (streaming approach)
      • Suitable for: Files of any size, browser and server environments
      • Recommended for: Large files (> 10MB) or memory-constrained environments

      This function processes CSV data as an async iterable iterator, yielding one record at a time. Memory footprint remains constant regardless of file size, making it ideal for large datasets.
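      Because records are yielded lazily, a consumer can also stop early: breaking out of the loop closes the iterator, so the remainder of the input need not be processed (a sketch; largeCsvString is a stand-in name for a bigger input):

      import { parse } from 'web-csv-toolbox';

      const largeCsvString = `name,age
      Alice,42
      Bob,69`; // stands in for a much larger input

      // Stop at the first matching record; break ends the iteration.
      for await (const record of parse(largeCsvString)) {
        if (record.name === 'Alice') {
          console.log(record);
          break; // closes the underlying iterator
        }
      }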

      import { parse } from 'web-csv-toolbox';

      const csv = `name,age
      Alice,42
      Bob,69`;

      for await (const record of parse(csv)) {
        console.log(record);
      }
      // Prints:
      // { name: 'Alice', age: '42' }
      // { name: 'Bob', age: '69' }

      import { parse } from 'web-csv-toolbox';

      const csv = `name,age
      Alice,42
      Bob,69`;

      const stream = new ReadableStream({
        start(controller) {
          controller.enqueue(csv);
          controller.close();
        },
      });

      for await (const record of parse(stream)) {
        console.log(record);
      }
      // Prints:
      // { name: 'Alice', age: '42' }
      // { name: 'Bob', age: '69' }

      import { parse } from 'web-csv-toolbox';

      // This CSV has no header.
      const csv = `Alice,42
      Bob,69`;

      for await (const record of parse(csv, { header: ['name', 'age'] })) {
        console.log(record);
      }
      // Prints:
      // { name: 'Alice', age: '42' }
      // { name: 'Bob', age: '69' }

      import { parse } from 'web-csv-toolbox';

      const csv = `name\tage
      Alice\t42
      Bob\t69`;

      for await (const record of parse(csv, { delimiter: '\t' })) {
        console.log(record);
      }
      // Prints:
      // { name: 'Alice', age: '42' }
      // { name: 'Bob', age: '69' }
    • Parse CSV to records.

      String, ReadableStream<string | Uint8Array> and Response are supported.

      Type Parameters

      • const Header extends readonly string[]

        Header type like ['name', 'age'].

      Parameters

      Returns AsyncIterableIterator<CSVRecord<Header>>

      Async iterable iterator of records.
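      A sketch of supplying the Header type argument explicitly, per this overload's type parameter (useful when the CSV lacks a header row; pairs with the header option shown in the examples above):

      import { parse } from 'web-csv-toolbox';

      const csv = `Alice,42
      Bob,69`;

      // The explicit type argument types each record as CSVRecord<['name', 'age']>.
      for await (const record of parse<['name', 'age']>(csv, { header: ['name', 'age'] })) {
        console.log(record.name, record.age);
      }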

      The description, dispatch table, performance characteristics, and examples are the same as in the first overload above; only the Header type parameter differs.
    • Parse CSV to records.

      String, ReadableStream<string | Uint8Array> and Response are supported.

      Type Parameters

      • const CSVSource extends string | ReadableStream<string>
      • const Delimiter extends string = ","
      • const Quotation extends string = "\""
      • const Header extends readonly string[] = PickCSVHeader<CSVSource, Delimiter, Quotation>

        Header type like ['name', 'age'].

      Parameters

      Returns AsyncIterableIterator<CSVRecord<Header>>

      Async iterable iterator of records.
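      As a sketch of what the PickCSVHeader default provides: when the CSV is a string literal, the header names can be inferred at the type level (hedged; assumes inference behaves as the type parameters declared above suggest):

      import { parse } from 'web-csv-toolbox';

      const csv = `name,age
      Alice,42
      Bob,69` as const;

      for await (const record of parse(csv)) {
        // record is inferred as CSVRecord<['name', 'age']>,
        // so record.name and record.age are typed fields.
        console.log(record.name, record.age);
      }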

      The description, dispatch table, performance characteristics, and examples are the same as in the first overload above; this overload adds explicit Delimiter and Quotation type parameters.
    • Parse CSV binary to records.

      Type Parameters

      • const Header extends readonly string[]

      Parameters

      Returns AsyncIterableIterator<CSVRecord<Header>>

      import { parse } from 'web-csv-toolbox';

      // This CSV data is not gzipped and is encoded in UTF-8.
      const response = await fetch('https://example.com/data.csv');

      for await (const record of parse(response)) {
        // ...
      }

      import { parse } from 'web-csv-toolbox';

      // This CSV data is gzipped, encoded in Shift_JIS, and has a BOM.
      const response = await fetch('https://example.com/data.csv.gz');

      for await (const record of parse(response, {
        charset: 'shift-jis',
        ignoreBOM: true,
        decompression: 'gzip',
      })) {
        // ...
      }
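      A binary source need not come from fetch; per the dispatch table above, a Uint8Array can be handed to parseBinary directly (a minimal sketch):

      import { parseBinary } from 'web-csv-toolbox';

      // Encode a CSV string into bytes; parseBinary consumes the Uint8Array.
      const binary = new TextEncoder().encode(`name,age
      Alice,42
      Bob,69`);

      for await (const record of parseBinary(binary)) {
        console.log(record);
      }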