
stream-json

stream-json is a micro-library of node.js stream components with minimal dependencies for creating custom data processors focused on huge JSON files while keeping a minimal memory footprint. It can parse JSON files far exceeding available memory. Even individual primitive data items (keys, strings, and numbers) can be streamed piece-wise. A streaming, SAX-inspired, event-based API is included as well.
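For example, a minimal sketch of consuming that token stream directly (the file name sample.json is an assumption for illustration):

const {parser} = require('stream-json');
const fs = require('fs');

// each token is an object such as {name: 'startObject'} or {name: 'keyValue', value: 'department'};
// long strings and numbers can also arrive piece-wise as stringChunk/numberChunk tokens
fs.createReadStream('sample.json')
  .pipe(parser())
  .on('data', token => console.log(token.name, token.value));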

Available components:

All components are meant to be building blocks to create flexible custom data processing pipelines. They can be extended and/or combined with custom code. They can be used together with stream-chain to simplify data processing.
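As an illustration, the same building blocks can also be connected with plain .pipe() calls; a minimal sketch, assuming a file named records.json that holds a top-level JSON array:

const {parser} = require('stream-json');
const {streamArray} = require('stream-json/streamers/StreamArray');
const fs = require('fs');

// StreamArray emits one {key, value} pair per element of the top-level array
fs.createReadStream('records.json')
  .pipe(parser())
  .pipe(streamArray())
  .on('data', ({key, value}) => console.log(key, value));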

This toolkit is distributed under the New BSD license.

Introduction

const {chain}  = require('stream-chain');

const {parser} = require('stream-json');
const {pick}   = require('stream-json/filters/Pick');
const {ignore} = require('stream-json/filters/Ignore');
const {streamValues} = require('stream-json/streamers/StreamValues');

const fs   = require('fs');
const zlib = require('zlib');

const pipeline = chain([
  fs.createReadStream('sample.json.gz'), // read the gzipped source file
  zlib.createGunzip(),                   // decompress it on the fly
  parser(),                              // parse JSON into a token stream
  pick({filter: 'data'}),                // pick the subtree under the "data" key
  ignore({filter: /\b_meta\b/i}),        // drop any "_meta" subtrees
  streamValues(),                        // assemble the remaining tokens into JS values
  data => {
    const value = data.value;
    // keep data only for the accounting department
    return value && value.department === 'accounting' ? data : null;
  }
]);

let counter = 0;
pipeline.on('data', () => ++counter);
pipeline.on('end', () =>
  console.log(`The accounting department has ${counter} employees.`));

See the full documentation in the project's Wiki.

Companion projects:

Installation

npm install --save stream-json
# or: yarn add stream-json

Use

The whole library is organized as a set of small components that can be combined to produce the most effective pipeline. All components are based on node.js streams and events, and they implement all the required standard APIs. It is easy to add your own components to solve your unique tasks.
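As a sketch, a custom object-mode Transform can be dropped into a pipeline after any streamer; the class name, the salary field, and the threshold below are assumptions for illustration only:

const {Transform} = require('stream');

// keeps only streamed {key, value} items whose value has a salary at or above a threshold
class MinSalary extends Transform {
  constructor(minSalary) {
    super({objectMode: true});
    this._minSalary = minSalary;
  }
  _transform(item, _encoding, callback) {
    if (item.value && item.value.salary >= this._minSalary) this.push(item);
    callback();
  }
}

// usage (hypothetical): .pipe(streamValues()).pipe(new MinSalary(50000))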

The code of all components is compact and simple. Please take a look at their source code to see how things are implemented, so you can produce your own components in no time.

If you find a bug, see a way to simplify existing components, or create new generic components that can be reused across projects, don't hesitate to open a ticket and/or create a pull request.

Release History
