I'm working on a project where we produce lots of data in the form of simple .txt files. These usually contain a set of columns that are later loaded into MATLAB to be analysed. I've implemented an application that uses a database to store information about each user of the application, and it needs to match that information to the results produced in the .txt files. The files are long, approaching 10000 rows, with 3-5 columns each.
My current application design simply links a database record to each file via an Id. This works fine provided the files are placed in a specific folder on the server and kept there, so that the database can always find them.
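For context, here is a minimal sketch of the current linking approach. The table and column names, the file paths, and the use of SQLite are all hypothetical and only for illustration; the real application may use a different database and schema.

```python
import sqlite3

# Hypothetical schema: session metadata lives in the database,
# while the numeric results stay on disk as .txt files.
conn = sqlite3.connect("results.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS sessions (
        session_id   INTEGER PRIMARY KEY,
        user_name    TEXT NOT NULL,
        result_file  TEXT NOT NULL   -- path to the .txt file on the server
    )
""")

# Register a new results file against a user.
conn.execute(
    "INSERT INTO sessions (user_name, result_file) VALUES (?, ?)",
    ("alice", "/data/results/session_0001.txt"),
)
conn.commit()

# Later: look up the file path for a given session Id so MATLAB
# (or anything else) can be pointed at the right .txt file.
row = conn.execute(
    "SELECT result_file FROM sessions WHERE session_id = ?", (1,)
).fetchone()
print(row[0])  # -> /data/results/session_0001.txt
conn.close()
```

The point is that the database only stores the path, so the design breaks if the files are ever moved or renamed outside the application.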
Should I instead have designed a way to store the rows of data in the database itself, even though that would mean hundreds of thousands of numbers linked to only one or two users and session Ids? What is the best practice in this situation?