Tuesday 30 October 2018

Monitor large files efficiently (REST API)

Hey, I'm working on a project that consists of the following components:

- Backend (Node.js)
- Frontend (React.js)

Furthermore, I'm about to write a REST API in C# (.NET Core) that will be distributed among a few clients. The REST API should monitor a few system properties (Windows services, disk space, currently running processes, RAM, etc.) and send the data back to the requester. The requester in most cases will be the backend, which accesses the REST API endpoints and stores the responses in a PostgreSQL database.

To simplify: accessing the frontend and requesting current system information about a specific system (a system in this case is just a Windows Server I want to monitor) results in the frontend communicating with the backend to request up-to-date system information from that system's REST API. The backend handles the request by calling the specific system's REST API, stores the response, and forwards it to the frontend. So far everything works fine.

Now I want to monitor files, e.g. logs, on any system. The challenge I'm facing is how to approach this. I want the REST API to be as slick and small as possible, leaving as small a footprint on the system as it can. Given that, monitoring, or just filtering with regular expressions, via the REST API (on the system/client side) and sending back the results doesn't seem efficient to me, since it can put a lot of stress on the system. On the other hand, I could simply create a REST API endpoint that transfers the file as an encoded stream, decode it on the backend, and filter the file there. This seems reasonable to me.

However, some of these files can be up to 400 MB. Any recommendations on how I should deal with them?

This is my first bigger project, so feel free to correct me on my ideas and thoughts. Please! :)
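To make the two options concrete, here is a rough sketch of what the client-side ASP.NET Core controller could look like. The controller name, routes, LogDirectory path, and the search endpoint are placeholders for illustration, not code I actually have:

using System.IO;
using System.Linq;
using System.Text.RegularExpressions;
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/logs")]
public class LogsController : ControllerBase
{
    // Placeholder base directory; a real API would read this from configuration
    // and validate the requested file name against a whitelist.
    private const string LogDirectory = @"C:\Logs";

    // Option 1: hand the raw file to the backend as a stream and filter it there.
    // PhysicalFile streams straight from disk, so a 400 MB log is never loaded
    // into the API's memory, and range processing lets the backend fetch it in chunks.
    [HttpGet("{fileName}")]
    public IActionResult Download(string fileName)
    {
        var path = Path.Combine(LogDirectory, fileName);
        if (!System.IO.File.Exists(path))
            return NotFound();

        return PhysicalFile(path, "text/plain", enableRangeProcessing: true);
    }

    // Option 2: filter on the monitored system and return only the matching lines.
    // File.ReadLines enumerates the file lazily, so memory stays flat even for
    // very large logs; the cost is one regex check per line on the client system.
    [HttpGet("{fileName}/search")]
    public IActionResult Search(string fileName, [FromQuery] string pattern)
    {
        var path = Path.Combine(LogDirectory, fileName);
        if (!System.IO.File.Exists(path))
            return NotFound();

        var regex = new Regex(pattern, RegexOptions.Compiled);
        var matches = System.IO.File.ReadLines(path)
                                    .Where(line => regex.IsMatch(line))
                                    .ToList();

        return Ok(matches);
    }
}

The way I see the trade-off: option 1 keeps the client-side footprint minimal and pushes the filtering work to the backend, at the cost of shipping up to 400 MB over the network; option 2 sends only the matching lines but spends CPU on the monitored system.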

Submitted October 30, 2018 at 09:04AM by Fasyx
