I have an Express server that needs to write about 100 HTML files into a directory in the project root. I can build this server in two ways: the server can receive the request from the browser, start the 100 downloads, and send back a 200 when it finishes; or I can send a 200 after every successful file write and go back and forth like that for every file.

My question is: when using Express/Node.js, which is better performance-wise? In scenario 1, should the server just sit there doing all that work for multiple users? Will I run into memory issues? Because I could put that list of files in the browser and make separate calls to my Node server to retrieve and write each file from the external API.

This is roughly what the first scenario looks like. The route gets a list of files, then makes a call for each file:

```javascript
// This whole request takes about 3 minutes to complete
// due to rate limiting of the external APIs.
router.get('/api/myroute', (req, res, next) => {
  // Contact a remote server's API; it sends back a big list of files.
  REMOTE_SERVER.file_list.list(USER_CREDS.id).then(files => {
    // We need the contents of each specific file, so we fetch them here.
    // Their API for a specific file requires a key taken from the
    // file_list object we retrieved above.
    Promise.all(files.map(item =>
      REMOTE_SERVER.specific_file.get(USER_CREDS.id, { file: { key: item.key } })
        .then(asset =>
          // Write the contents of each file to a directory called
          // "my_files" in the project root. Returning the promise from
          // fs.promises.writeFile makes Promise.all wait for the write,
          // not just the API call.
          fs.promises.writeFile('./my_files/' + asset.key, asset.value)
        )
    ))
    // Send back a 200 only when the entire list has been
    // retrieved and written to our directory.
    .then(() => {
      console.log("DONE!!");
      res.status(200).send();
    });
  });
});
```
Submitted June 21, 2019 at 01:42AM by joWebDev