Not sure if this is possible.
I ran a functional test that takes a measurement every 2 ms for 7 minutes and saves the data into what is essentially a CSV file (it has a different extension, but you can open it in Excel just like one). Each data file has something like 211,000 rows, and I have about 100 of these files for different subjects.
What I would like is a macro that opens each CSV, copies 3 columns from certain rows, and pastes them into a new workbook, preferably with some way to identify which file each row came from and to keep things organized.
Say I have 10 files in folder A; it would output something like:
Origin File 1 name, Row #1: column A, column B, column C
Origin File 1 name, Row #2: column A, column B, column C
Origin File 2 name, Row #1: column A, column B, column C
Origin File 2 name, Row #2: column A, column B, column C
...
Origin File 10 name, Row #1: column A, column B, column C
Origin File 10 name, Row #2: column A, column B, column C
I imagine you could do this with a macro but I have no clue how to even go about doing that.
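For anyone trying this in Python rather than a VBA macro, here is a minimal sketch using only the standard csv module. The folder name, file extension, row numbers, and column positions are all placeholder assumptions to adjust for the real data:

```python
import csv
from pathlib import Path

# Placeholder assumptions: files live in "folder_a", end in .csv, and we
# want columns A, B, C (0-based indices 0-2) from rows 100 and 200.
FOLDER = Path("folder_a")
PATTERN = "*.csv"               # swap in the instrument's real extension
WANTED_ROWS = {100, 200}        # 0-based row indices to keep
WANTED_COLS = [0, 1, 2]         # columns A, B, C

with open("combined.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["source_file", "row", "col_A", "col_B", "col_C"])
    for path in sorted(FOLDER.glob(PATTERN)):
        with open(path, newline="") as f:
            for i, row in enumerate(csv.reader(f)):
                if i in WANTED_ROWS:
                    writer.writerow([path.name, i] + [row[c] for c in WANTED_COLS])
```

Each output row carries the source file name and the original row index, which covers the "identify the file it came from" requirement.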
Edit:
I mean, it can be in Java or whatever other language you're familiar with. I'm thinking that if you're writing cool Excel macros, I'm sure you're knowledgeable enough to use a good programming language and slap something together in a little bit of time!
HA, I am not knowledgeable in anything. Not a programmer, I just know to ask someone else to do it.
That's probably a bit vague but hopefully enough to get you started.
When I discovered Power Query back in 2015 or so, I cried a little as I watched the last 5 years of my life be wasted doing this type of task in Java, C#, R, Python...
Just so much faster. And easier. And more repeatable. Etc.
I wouldn't really recommend pandas to a beginner. The csv module should be perfectly fine for this kind of manipulation. Pandas is more for big data.
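That said, for comparison, a pandas version of the same merge is about as short. The folder, extension, and column positions below are again placeholder assumptions:

```python
import pandas as pd
from pathlib import Path

# Same placeholder folder/extension as the csv sketch above; usecols grabs
# the three columns by 0-based position, and "source_file" records provenance.
frames = []
for path in sorted(Path("folder_a").glob("*.csv")):
    df = pd.read_csv(path, usecols=[0, 1, 2], header=None)
    df.insert(0, "source_file", path.name)
    frames.append(df)
pd.concat(frames, ignore_index=True).to_csv("combined.csv", index=False)
```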
Be warned that Excel has a limit of 1,048,576 rows per sheet. If you want to combine all the files, the resulting file might not be openable in Excel.
I'm not certain, but I think Excel may open files with more than 1M rows if you have the RAM, but you will lose any records past the limit.
We noticed that some huge files were getting smaller after a small header tweak in Excel, and believe this was what was happening, but moved on because, fuck it, Python.
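If a combined file does blow past that cap, one workaround is to split it into Excel-sized pieces before opening it. A rough sketch, assuming the file has a header row (the part_N.csv naming is made up):

```python
import csv

MAX_ROWS = 1_000_000  # stay safely under Excel's 1,048,576-row cap

def split_csv(path, prefix="part"):
    """Split a big CSV into Excel-sized pieces, repeating the header row."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        out, writer, part, count = None, None, 0, MAX_ROWS
        for row in reader:
            if count >= MAX_ROWS:          # start a new part_N.csv
                if out:
                    out.close()
                part += 1
                out = open(f"{prefix}_{part}.csv", "w", newline="")
                writer = csv.writer(out)
                writer.writerow(header)
                count = 0
            writer.writerow(row)
            count += 1
        if out:
            out.close()

split_csv("combined.csv")
```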
But here is a "No, fuck you: Power Query/Pivot!" solution
https://www.masterdataanalysis.com/ms-excel/analyzing-50-million-records-excel/
I am working with output files from a stimulator/force transducer. Our fatigue protocol is a 7-minute course of stimulations every 4 seconds, but it also takes a measurement every 2 ms.
I wanted to grab the measurements at each minute mark (0, 1, 2, ..., 7) to make the data easier to work with.
I ended up just brute-forcing it with the analysis software; it wasn't that bad, since it could handle roughly 30 files at a time.
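For reference, those minute-mark rows are easy to compute from the 2 ms sampling interval, which also lines up with the ~211,000 rows per file (0-based indices, assuming the first sample is at t = 0):

```python
# At one sample every 2 ms, a minute is 60_000 / 2 = 30_000 rows, so the
# minute marks of a 7-minute run land on these 0-based row indices
# (7 * 30_000 = 210_000, which matches the ~211,000 rows per file):
SAMPLE_INTERVAL_MS = 2
ROWS_PER_MINUTE = 60_000 // SAMPLE_INTERVAL_MS   # 30,000
minute_rows = [m * ROWS_PER_MINUTE for m in range(8)]  # minutes 0 through 7
print(minute_rows)  # [0, 30000, 60000, ..., 180000, 210000]
```

Those eight indices could go straight into WANTED_ROWS in the csv sketch near the top of the thread.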