Thursday 21 March 2019

Can someone here help me out a bit? Having issues with either Node, promises, or Sequelize. In over my head...

Hey there r/node, thank you for providing this community. I didn't even know about it until a few months ago, and I've tried to be helpful, but someone usually beats me to it! :)

This time, I need a hand. I'm running a Node app on an Ubuntu machine that connects to a remote MySQL database hosted on a DigitalOcean droplet. The droplet isn't the tiniest size, and I've done bigger things than this with MySQL over the years without these sorts of issues. I'm having a hard time with this, I'm overdue, my mom's going to ground me if I don't get this finished, and then I won't get my $10 for the book fair next week. :(

In a nutshell, I use the filesystem to pipe data from a CSV file into an array, so each element is a [Firetruck Name, Firetruck Location, Firetruck Owner, Firetruck Age]. The only piece I'm interested in is the "Firetruck Name".

So I created a csvdata.forEach loop, and inside it I used a Sequelize model of my firetruck to store each name in the remote database. The table has two columns: ID and FireTruck_Name. All of the mapping works correctly; I can call models.Firetruck.findOrCreate({ where: { FireTruck_Name: fireTruckName } }) just once, for example, and all works well.

But when I loop through my csvdata using forEach, I run into errors like this:

    name: 'SequelizeConnectionAcquireTimeoutError',
    parent: TimeoutError: Operation timeout
        at Timeout._timeout.setTimeout (/media/noel/FILE_LOCKER/DEV/PCP2/node_modules/sequelize-pool/lib/Deferred.js:19:19)
        at ontimeout (timers.js:436:11)
        at tryOnTimeout (timers.js:300:5)
        at listOnTimeout (timers.js:263:5)
        at Timer.processTimers (timers.js:223:10),
    original: TimeoutError: Operation timeout
        at Timeout._timeout.setTimeout (/media/noel/FILE_LOCKER/DEV/PCP2/node_modules/sequelize-pool/lib/Deferred.js:19:19)
        at ontimeout (timers.js:436:11)

Here's the relevant code:

    myCSVArray = []; // this array contains 100,000+ rows of CSV data

    var updateFiretrucks = () => {
      console.log("Updating Firetrucks...");
      myCSVArray.forEach((row) => {
        var strTruck = row["Engine Name"];
        // check the in-memory list first so we can skip the database call
        var exists = false;
        for (var i = 0; i < lsFiretrucks.length; i++) {
          if (lsFiretrucks[i] === strTruck) {
            exists = true;
          }
        }
        if (!exists) {
          models.Firetruck.findOrCreate({ where: { FireTruck_Name: strTruck } })
            .spread(function (truckResult, created) {
              lsFiretrucks.push(truckResult.FireTruck_Name);
              lsFiretrucksID.push(truckResult.ID);
              if (created) {
                console.log(truckResult.FireTruck_Name + " added to the database.");
              }
            })
            .catch((error) => {
              console.log(error);
            });
        }
      });
    };

I've looked through every forum, web site, and whatnot, and I DO see how Sequelize 5.x was supposed to solve the very issue I'm having. My connection pooling options look like this:

    "pool": {
      "max": 1000,
      "min": 0,
      "idle": 650000,
      "acquire": 1000000
    }

Connection pooling seems to be the issue that comes up a lot, but setting mine to reasonably high values, and then to the sky-high-probably-won't-work values you see here, hasn't produced favorable results.

I added a few arrays to first check whether a value from the CSV has already been seen, so if it has, we can skip the whole database/Sequelize step. Either it's not being called right or it's only being called once, because I'm still getting duplicated rows in the database... which I understand, because in the console I can see some commits start, then more commits start, and then some of the original commits end, all before the first few have finished. Which is screwed up. :)
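My working theory, given those interleaved commits, is that forEach fires off all 100,000+ findOrCreate calls at once: the pool runs out of connections (hence the acquire timeout), and the in-memory duplicate check races against queries that haven't resolved yet (hence the duplicate rows). Here's a minimal sketch of what I'm planning to try next; seenTrucks is a name I made up for this post, everything else (myCSVArray, models.Firetruck) is from my real code:

    const seenTrucks = new Set();

    async function updateFiretrucks() {
      console.log("Updating Firetrucks...");
      for (const row of myCSVArray) {
        const strTruck = row["Engine Name"];
        // nothing runs concurrently here, so the duplicate check
        // can't race against in-flight queries
        if (seenTrucks.has(strTruck)) continue;
        seenTrucks.add(strTruck);
        try {
          // awaiting means only one findOrCreate holds a pool
          // connection at a time, instead of 100,000 at once
          const [truckResult, created] = await models.Firetruck.findOrCreate({
            where: { FireTruck_Name: strTruck }
          });
          if (created) {
            console.log(truckResult.FireTruck_Name + " added to the database.");
          }
        } catch (error) {
          console.log(error);
        }
      }
    }

Does that reasoning hold up?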
I've been reading through the Node documentation about streams, pipeline(), and backpressure (my best attempt at putting that together is sketched below), but it's like knocking on dead wood right now; I've been at this for 13 hours. Help? Please?
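For reference, here's how I understand the streaming approach from the docs: instead of loading everything into an array, pipe the file through a CSV parser into a Writable that only asks for the next row once the database write has resolved. csv-parse and "firetrucks.csv" are stand-ins (I haven't actually used that package), so treat this as a sketch, not working code:

    const fs = require("fs");
    const { pipeline, Writable } = require("stream");
    const parse = require("csv-parse");

    const dbWriter = new Writable({
      objectMode: true,
      write(row, _encoding, callback) {
        models.Firetruck.findOrCreate({
          where: { FireTruck_Name: row["Engine Name"] }
        })
          // delaying callback() until the query resolves is what creates
          // backpressure: the parser pauses instead of flooding the pool
          .then(() => callback())
          .catch(callback);
      }
    });

    pipeline(
      fs.createReadStream("firetrucks.csv"),
      parse({ columns: true }), // emit each row as an object keyed by header
      dbWriter,
      (err) => console.log(err ? err.message : "All rows processed.")
    );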

Submitted March 22, 2019 at 02:45AM by YourQuestIsComplete
