Tuesday 9 January 2018

My API is getting slower as I get more data. Any ideas for optimisation? (MongoDB, Express, Node 8)

Hey guys, this came up recently because an API I'm working on is getting progressively slower as the data grows. Symptoms range from hitting the aggregation memory limit (Mongo asking me to enable 'allowDiskUse') to requests taking around 20 seconds on larger sets of data.

I have considered the following:

- Removing Mongo's aggregate and applying my own transformations outside of Mongo
- Piping results straight from Mongo, though I'm not sure how I'd apply transformations to the data between the cursor and the stream (something like the streaming sketch below?)
- Moving to a different database technology like PostgreSQL

So I was wondering if you guys have any ideas on how to handle large amounts of data coming out of Mongo: how to process it on the server and how to serve it. I've never had much experience with this sort of problem, so a bit of guidance would be appreciated. Thanks!
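On the memory-limit error: allowDiskUse is just an option on the aggregate call, so it's a quick thing to try before restructuring anything. A minimal sketch, assuming the official 'mongodb' 3.x driver; the connection string, database, collection, and pipeline are placeholders:

```js
// Minimal sketch of passing allowDiskUse to an aggregation, assuming the
// official 'mongodb' 3.x driver. Connection string, database, collection,
// and the pipeline itself are placeholders.
const { MongoClient } = require('mongodb');

async function run() {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const events = client.db('mydb').collection('events');

  // allowDiskUse lets pipeline stages spill to temporary files on disk
  // instead of erroring out when they exceed the in-memory limit.
  const results = await events.aggregate(
    [
      { $group: { _id: '$userId', count: { $sum: 1 } } },
      { $sort: { count: -1 } }
    ],
    { allowDiskUse: true }
  ).toArray();

  console.log(results.length);
  await client.close();
}

run().catch(console.error);
```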
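For the piping idea, one way to do transformations between the cursor and the response is a Transform stream in object mode, so each document is reshaped as it flows through rather than being buffered. A sketch, again assuming the 3.x 'mongodb' driver plus Express; route, collection, and field names are made up for illustration:

```js
// Sketch of streaming a cursor through a Transform into the HTTP response,
// so large result sets never sit in memory all at once. Assumes the 3.x
// 'mongodb' driver and Express; names and fields are placeholders.
const express = require('express');
const { Transform } = require('stream');
const { MongoClient } = require('mongodb');

const app = express();

MongoClient.connect('mongodb://localhost:27017').then((client) => {
  const events = client.db('mydb').collection('events');

  app.get('/events', (req, res) => {
    res.setHeader('Content-Type', 'application/x-ndjson');

    // Per-document transform: reshape each record here before it is
    // serialised, instead of doing it in an aggregate stage.
    const reshape = new Transform({
      objectMode: true,
      transform(doc, _encoding, callback) {
        callback(null, JSON.stringify({ id: doc._id, ts: doc.createdAt }) + '\n');
      }
    });

    events.find({})
      .stream()       // the cursor exposed as a readable object stream
      .pipe(reshape)  // transformations happen document by document
      .pipe(res);     // served as newline-delimited JSON
  });

  app.listen(3000);
});
```

Because everything is piped, back-pressure is preserved: the cursor only pulls more documents as fast as the client consumes them, which tends to keep memory flat regardless of result size.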

Submitted January 09, 2018 at 10:48PM by cruzcontrol56
