In a current project, I use file names to store data, then extract that data to build some product information.
At the moment, my code makes extensive use of NthField(Name, Delimiter, Index).
With 500 master folders to scan (36 to 68 files per folder), the search takes several seconds (5?).
I was wondering whether using
Dim parts() As String = Split(Name, Delimiter)
TA.AppendText parts(0) // then parts(1), etc.
would be faster.
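For what it's worth, the reason Split usually wins here: each NthField call re-scans the string from the beginning, so extracting k fields from one name costs roughly k passes, while Split parses the name once. A minimal sketch (hedged; variable names are mine, and remember Split arrays are zero-based while NthField is one-based):

```xojo
// Hypothetical sketch: parse each file name once with Split
// instead of calling NthField once per field.
Dim parts() As String = Split(fileName, Delimiter)
// NthField(fileName, Delimiter, 1) corresponds to parts(0)

Dim pieces() As String   // collect output fragments here
For i As Integer = 0 To UBound(parts)
  pieces.Append(parts(i))
Next

// One AppendText call instead of one per field
TA.AppendText Join(pieces, EndOfLine)
```

Collecting the fragments and calling AppendText once at the end also avoids forcing the TextArea to redraw for every field.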
(Feel free to send advice too if you have any.)
I have found that getting file names in a folder can be slow. I’m not sure if this is what you’re doing, but if so, get all the file names in the folder once. Store them in an array or dictionary. Then do your processing.
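As a sketch of that idea (hedged; `masterFolder` is a placeholder FolderItem, and this uses the classic 1-based FolderItem API):

```xojo
// Hypothetical sketch: read the folder's child names once,
// store them in an array, then process the in-memory list.
Dim names() As String
Dim f As FolderItem = masterFolder  // assumed: the folder to scan

For i As Integer = 1 To f.Count     // classic FolderItem.Item is 1-based
  names.Append(f.Item(i).Name)
Next

// From here on, work only on names() — no more file-system calls.
```

Hitting the file system is orders of magnitude slower than working on strings in memory, so touching each FolderItem exactly once tends to pay off.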
What I do for string manipulation is to split the string into an array of characters. That speeds things up a lot. For instance, if you have some HTML and need to insert something into each <img> element, or replace the "phoning home" 1px x 1px images that spammers insert into their spam, I have to find the <img> element and then replace the argument of its src="some URL here" attribute with a reference to my own 1px x 1px image, etc.
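In case it helps, the character-array trick can be sketched like this (hedged; `someHTML` is a placeholder):

```xojo
// Hypothetical sketch: an empty delimiter makes Split return
// one array element per character.
Dim chars() As String = Split(someHTML, "")

// ...scan or modify chars() in place here...

// Reassemble in one linear pass:
Dim rebuilt As String = Join(chars, "")
```

Indexing into an array element is much cheaper than repeated Mid calls, which, like NthField, re-walk the string on every call.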
Always use join/split where possible.
And maybe assemble smaller strings before adding them to bigger ones.
Of course, don't optimize where only maybe 10 strings are involved.
Optimize where you have thousands of them.
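The reason the array-plus-Join pattern matters: repeated concatenation (s = s + piece) copies the whole growing buffer on every addition, which is quadratic overall, while appending to an array and joining once is linear. A hedged sketch:

```xojo
// Hypothetical sketch: build a large string with an array + Join
// instead of repeated "+" concatenation in a loop.
Dim chunks() As String
For i As Integer = 1 To 1000
  chunks.Append("line " + Str(i))
Next

Dim result As String = Join(chunks, EndOfLine)
```

With ~10 strings the difference is invisible; with thousands (500 folders times 36-68 files) it can dominate the run time.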
If you use the Children iterator I think you’ll find that this problem mostly goes away.
Thank you all.
I was not asking about the "read the file names" part, but about the "process the file name" part.
So I have to run tests to work out which part takes the most time, or simply use Split in a second button and compare the times for both buttons.
On macOS, reading 480 child names is not slow. Reading the 36 to 68 children of each of those children (the "grandchildren") with the field separation takes some seconds (5 or 6), including the HTML concatenation and writing to a TextArea (using AppendText).