web-csv-toolbox

    Function parseString

    • Parse a CSV string into records.

      Type Parameters

      • const CSVSource extends string

      Parameters

      • csv: CSVSource

        CSV string to parse

      Returns AsyncIterableIterator<CSVRecord<PickCSVHeader<CSVSource>>>

      Async iterable iterator of records.

      If you want an array of records, use the parseString.toArray function.
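
      A minimal sketch of the array form (the inline CSV literal is illustrative):

      import { parseString } from 'web-csv-toolbox';

      // Collect all records into an array instead of iterating
      const records = await parseString.toArray('name,age\nAlice,42');
      console.log(records);
      // [{ name: 'Alice', age: '42' }]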

      Performance Characteristics:

      • Memory usage: O(1) - constant per record (streaming approach)
      • Suitable for: Files of any size
      • Recommended for: Large CSV strings (> 10MB) or memory-constrained environments

      Execution Strategies: Control how parsing is executed using the engine option:

      • Main thread (default): engine: { worker: false } - No overhead, good for small files
      • Worker thread: engine: { worker: true } - Offloads parsing, good for large files
      • WebAssembly: engine: { wasm: true } - Fast parsing, limited to UTF-8 and double-quotes
      • Combined: engine: { worker: true, wasm: true } - Worker + WASM for maximum performance

      Use EnginePresets for convenient configurations:

      import { parseString, EnginePresets } from 'web-csv-toolbox';

      // Use fastest available execution method
      for await (const record of parseString(csv, {
        engine: EnginePresets.fastest()
      })) {
        console.log(record);
      }

      import { parseString } from 'web-csv-toolbox';

      const csv = `name,age
      Alice,42
      Bob,69`;

      for await (const record of parseString(csv)) {
        console.log(record);
      }
      // Prints:
      // { name: 'Alice', age: '42' }
      // { name: 'Bob', age: '69' }

      import { parseString } from 'web-csv-toolbox';

      // Offload parsing to a worker thread
      for await (const record of parseString(largeCSV, {
        engine: { worker: true }
      })) {
        console.log(record);
      }
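
      The WASM strategy needs the WebAssembly module initialized before use; a sketch, assuming the package's loadWASM export is that initializer (csv holds a CSV string, as in the examples above):

      import { parseString, loadWASM } from 'web-csv-toolbox';

      // One-time initialization of the WebAssembly module (loadWASM is assumed here)
      await loadWASM();

      // WASM parsing: UTF-8 input and double-quote quotation only
      for await (const record of parseString(csv, {
        engine: { wasm: true }
      })) {
        console.log(record);
      }
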
    • Parse a CSV string into records.

      Type Parameters

      • const Header extends readonly string[]

      Parameters

      • csv: string

        CSV string to parse

      Returns AsyncIterableIterator<CSVRecord<Header>>

      Async iterable iterator of records.

      If you want an array of records, use the parseString.toArray function.

      Performance characteristics, execution strategies, and examples are the same as for the first overload above.
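
      A sketch of supplying the Header type parameter explicitly; this only narrows the record type at compile time, runtime behavior is unchanged:

      import { parseString } from 'web-csv-toolbox';

      const csv = 'name,age\nAlice,42';

      // Records are typed as CSVRecord<['name', 'age']>
      for await (const record of parseString<['name', 'age']>(csv)) {
        console.log(record.name, record.age);
      }
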
    • Parse a CSV string into records.

      Type Parameters

      • const Header extends readonly string[]

      Parameters

      • csv: string

        CSV string to parse

      Returns AsyncIterableIterator<CSVRecord<Header>>

      Async iterable iterator of records.

      If you want an array of records, use the parseString.toArray function.

      Performance characteristics, execution strategies, and examples are the same as for the first overload above.
    • Parse a CSV string into records.

      Type Parameters

      • const CSVSource extends string
      • const Delimiter extends string = ","
      • const Quotation extends string = "\""
      • const Header extends readonly string[] = PickCSVHeader<CSVSource, Delimiter, Quotation>

      Parameters

      • csv: CSVSource

        CSV string to parse

      Returns AsyncIterableIterator<CSVRecord<Header>>

      Async iterable iterator of records.

      If you want an array of records, use the parseString.toArray function.

      Performance characteristics, execution strategies, and examples are the same as for the first overload above.
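
      This overload infers Header via PickCSVHeader from custom Delimiter and Quotation types; a sketch using a tab delimiter (the delimiter option mirrors the Delimiter type parameter above):

      import { parseString } from 'web-csv-toolbox';

      const tsv = 'name\tage\nAlice\t42';

      // Parse tab-separated values by overriding the default "," delimiter
      for await (const record of parseString(tsv, { delimiter: '\t' })) {
        console.log(record); // { name: 'Alice', age: '42' }
      }
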
    • Parse a CSV string into records.

      Parameters

      • csv: string

        CSV string to parse

      • Optional options: ParseOptions<readonly string[], ",", "\"">

        Parsing options. See ParseOptions.

      Returns AsyncIterableIterator<CSVRecord<string[]>>

      Async iterable iterator of records.

      If you want an array of records, use the parseString.toArray function.

      Performance characteristics, execution strategies, and examples are the same as for the first overload above.
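
      A sketch of passing explicit ParseOptions; the delimiter and quotation fields correspond to the "," and "\"" defaults shown in the type above (the semicolon-and-single-quote input is illustrative):

      import { parseString } from 'web-csv-toolbox';

      const csv = "name;age\n'Alice';42";

      // Override the default delimiter (",") and quotation ('"')
      for await (const record of parseString(csv, {
        delimiter: ';',
        quotation: "'"
      })) {
        console.log(record); // { name: 'Alice', age: '42' }
      }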