Thursday 26 September 2019

I built a module for working with JSON. It works, but it's alpha. How useful is it? Should I continue?

I feel like a lot of JS applications rely on JSON.parse / JSON.stringify. But then one day they grow up a little and run into JSON that is bigger than they initially allowed for: whether it's one huge JSON document too big to even fit in memory, or JSON that's just big enough that you don't want to process it synchronously and block.

So, I built a drop-in replacement for JSON that works asynchronously. Just change JSON.parse(string) to await JSLOB.parse(stringOrStream), and the same for stringify/streamify. It stores the key/value pairs in any Level-compliant DB, which you can pass in, whether you want it in-memory, on-disk, or clustered.

Loading/serializing is working fine, as is asynchronously accessing properties on the resulting object (see the README for examples). But there's still a ton of other functionality to build, like asynchronous setting, iterating keys, etc. Not to mention code cleanup.

I'm trying to figure out whether this is worth working on further. Would this be valuable to someone? I imagine it should be, both in data-engineering use cases and simply for keeping your web server from blocking while it parses a large JSON payload sent by a user (without having to tediously rewrite your code to stream).

Here's the project: https://www.npmjs.com/package/jslob

Thoughts? Feedback?
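To make the "drop-in" idea concrete, here's a minimal sketch of the API shape described above. Note that `JSLOB` here is a stand-in I wrote for illustration, not the real jslob package: it only defers `JSON.parse`/`JSON.stringify` to a later tick, whereas the actual package chunks the work and backs the object with a Level-compliant store. The exact jslob API may differ.

```javascript
// Hypothetical stand-in showing the drop-in pattern: same call shape
// as JSON.parse / JSON.stringify, but promise-returning so the caller
// awaits instead of blocking in one long synchronous burst.
const JSLOB = {
  parse(stringOrStream) {
    return new Promise((resolve, reject) => {
      // Defer to a later tick; the real package would also accept a
      // stream and write key/value pairs into a Level-compliant DB.
      setImmediate(() => {
        try { resolve(JSON.parse(stringOrStream)); }
        catch (err) { reject(err); }
      });
    });
  },
  stringify(value) {
    return new Promise((resolve, reject) => {
      setImmediate(() => {
        try { resolve(JSON.stringify(value)); }
        catch (err) { reject(err); }
      });
    });
  },
};

async function main() {
  // Before: const data = JSON.parse(input);
  // After:  const data = await JSLOB.parse(input);
  const data = await JSLOB.parse('{"users":[{"name":"Ada"}]}');
  console.log(data.users[0].name);
  console.log(await JSLOB.stringify(data));
}

main();
```

The point of keeping the signatures parallel is that migrating existing code is mostly a matter of adding `await` at each call site, rather than restructuring everything around stream events.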

Submitted September 27, 2019 at 02:32AM by Why_Is_The_RSS_Gone
