At my company, we use a distributed pool of virtual machines to run our UI and API tests. These machines are all connected to an onsite server which the pool uses for publishing the test results and outputs.
The problem is that storage on this server is limited, and each day we produce 500 MB to 5 GB of reports (CSVs, screenshots, text logs, etc.). We would like to preserve these reports to help QA identify issues, but we routinely end up deleting large batches of them just to free up space.
Recently, we moved our test scripts and inputs to a Git repo on VSTS. This not only frees up some space on our test server, but also gives us source control.
We want to do the same with the test outputs. The only issue is that the repo for this would be MASSIVE, far larger than the small local storage allotted to each test machine. And since everything I've found online suggests that each machine would need a full clone of the repo in order to push to it, that approach is unworkable.
My question is: how can I make this work? Is there a way to push an individual file or a collection of files to a VSTS repo without cloning it locally first? I've looked at Git submodules, but I'm unsure how reliable or stable they would be here, since getting each repo down to a reasonable size would require about 1,500 submodules. Is there a better solution for storing large amounts of test output data?
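For concreteness, here is the kind of thing I'm hoping is possible, sketched in Python against the VSTS Git REST API's "Pushes" endpoint, which appears to let you create a commit server-side from inline file content. The account, project, repo names, and PAT below are placeholders, and the api-version values (and whether the URL needs a /DefaultCollection segment) are guesses on my part:

```python
# Hypothetical sketch: push one file to a VSTS Git repo with no local clone,
# via the Git REST API. Names and api-version values are placeholders.
import base64

import requests  # pip install requests

ACCOUNT = "myaccount"      # placeholder VSTS account
PROJECT = "MyProject"      # placeholder team project
REPO = "test-reports"      # placeholder repo for test outputs
PAT = "<personal-access-token>"

BASE = f"https://{ACCOUNT}.visualstudio.com/{PROJECT}/_apis/git/repositories/{REPO}"
AUTH = ("", PAT)  # a PAT can be sent as the password of a basic-auth pair

# 1. Look up the current tip of the target branch; the push must reference it.
refs = requests.get(f"{BASE}/refs?filter=heads/master&api-version=1.0", auth=AUTH)
refs.raise_for_status()
old_object_id = refs.json()["value"][0]["objectId"]

# 2. Create a push containing one commit that adds a single new file,
#    with the file content supplied inline (base64-encoded).
with open("results/run-123/log.txt", "rb") as f:
    content = base64.b64encode(f.read()).decode("ascii")

push_body = {
    "refUpdates": [{"name": "refs/heads/master", "oldObjectId": old_object_id}],
    "commits": [{
        "comment": "Publish test output for run 123",
        "changes": [{
            "changeType": "add",  # "add" requires that the path not exist yet
            "item": {"path": "/run-123/log.txt"},
            "newContent": {"content": content, "contentType": "base64encoded"},
        }],
    }],
}
resp = requests.post(f"{BASE}/pushes?api-version=2.0-preview",
                     json=push_body, auth=AUTH)
resp.raise_for_status()
```

If something along these lines is viable, each test machine would only need HTTPS access and a PAT rather than a clone. But I don't know whether this holds up at 5 GB/day of binary screenshots, or how concurrent pushes from the pool would behave when they race on oldObjectId, which is part of why I'm asking.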