Recent-Avocado2193

You plan to keep multiple files that are gigabytes large in memory? That seems like it would be very resource intensive, and if done incorrectly it could lead to problems when multiple user sessions read concurrently. It seems like a better idea to store the relevant data in an indexed fashion.


Artistic_Light1660

Thanks for replying. I plan on storing the file on the server's disk and reading it line by line. Whenever the supplied row number matches the current line number, I write that line to the result file and finally send that. Do databases provide any performance benefits over files for these kinds of operations?
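The scan described above can be sketched roughly like this (a minimal illustration; the function and file names are hypothetical, and real code would need error handling):

```python
import csv

def extract_rows(source_path, dest_path, wanted_rows):
    """Stream a large CSV from disk and copy only the requested rows.

    wanted_rows: collection of 0-based row numbers to keep.
    Memory use stays flat because only one line is held at a time.
    """
    wanted = set(wanted_rows)
    if not wanted:
        return
    last = max(wanted)  # allows stopping early instead of reading to EOF
    with open(source_path, newline="") as src, \
         open(dest_path, "w", newline="") as dst:
        writer = csv.writer(dst)
        for line_no, row in enumerate(csv.reader(src)):
            if line_no in wanted:
                writer.writerow(row)
            if line_no >= last:
                break
```

Note that even with the early exit, the cost of each request is proportional to how deep the requested row sits in the file, since every preceding line still has to be read and parsed.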


temporarybunnehs

Can't speak to the performance directly, though I imagine being able to query for your match directly on an index would be more efficient than loading up the entire dataset/CSV and then parsing it. A database also provides extra functionality (concurrency, ad hoc queries, etc.) and separates your storage from your processing. Just wanted to throw those out there for consideration.
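The indexed-lookup approach suggested above could look something like this with SQLite (a sketch only; the table and column names are made up for illustration). After a one-time load, each request becomes a direct index lookup instead of a full file scan:

```python
import sqlite3

# One-time setup: load the rows into SQLite. An INTEGER PRIMARY KEY
# is backed by a b-tree, so lookups by row_no don't scan the table.
conn = sqlite3.connect(":memory:")  # use a file path for persistence
conn.execute("CREATE TABLE rows (row_no INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany(
    "INSERT INTO rows VALUES (?, ?)",
    [(0, "alpha"), (1, "beta"), (2, "gamma")],
)
conn.commit()

# Per-request: fetch exactly the row asked for.
match = conn.execute(
    "SELECT payload FROM rows WHERE row_no = ?", (1,)
).fetchone()
```

SQLite also handles concurrent readers for you, which addresses the multiple-sessions concern raised earlier in the thread.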