I am building a crawler that is meant to keep track of 1 million+ product pages and their stats in order to predict trends. Currently I am using Node.js and Mongo, storing each product as a document with an embedded array of objects holding stats like rating and number of orders. Stats are appended to the document each time the crawler scrapes that page, so we can see historical data.

However, even with only 80k documents, searching based on stats already feels too slow. I am hoping you guys can give advice on the best tech for storing and searching analytic data like this. I'm thinking of switching to an SQL database.

So, what infrastructure would you use for this type of application? Any guidance would be much appreciated.
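For reference, here is a minimal sketch of the kind of document shape described above, and one reason it gets slow to query: filtering on values buried in an embedded array means Mongo either scans every document's array or needs a multikey index over it. The field names (`rating`, `orders`, `scrapedAt`) and the `denormalize` helper are my assumptions for illustration, not the OP's actual schema.

```javascript
// Hypothetical document shape: one document per product, with an embedded
// array of per-scrape stat snapshots (field names are assumptions).
const product = {
  url: "https://example.com/item/123",
  title: "Example product",
  stats: [
    { scrapedAt: "2019-01-08", rating: 4.5, orders: 1200 },
    { scrapedAt: "2019-01-09", rating: 4.6, orders: 1350 },
    { scrapedAt: "2019-01-10", rating: 4.6, orders: 1500 },
  ],
};

// The most recent snapshot is the last element of the embedded array.
function latestStat(doc) {
  return doc.stats[doc.stats.length - 1];
}

// One common mitigation: denormalize the current values onto the document
// root so queries like "orders > 1000" can use a plain single-field index
// (e.g. { latestOrders: 1 }) instead of scanning the stats array.
function denormalize(doc) {
  const latest = latestStat(doc);
  return { ...doc, latestRating: latest.rating, latestOrders: latest.orders };
}

console.log(denormalize(product).latestOrders); // 1500
```

With that split, the embedded array serves only the historical/trend queries, while the hot "search by current stats" path hits an ordinary index.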
Submitted January 10, 2019 at 06:50PM by TheArksmith