
Searching directories for tons of files?


"I'm using MSVE, and I have my own tiles I'm displaying in layers on top. Problem is, there's a ton of them, and they're on a network server. In certain directories, there are something on the order of 30,000+ files. Initially I called Directory.GetFiles, but once I started testing in a pseudo-real environment, it timed out.

What's the best way to programmatically list, and iterate through, this many files?

Edit: My coworker suggested using the MS indexing service. Has anyone tried this approach, and (how) has it worked?"
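The usual fix for this is to stream directory entries lazily instead of asking for the whole list up front: in later versions of .NET that is `Directory.EnumerateFiles` (which yields results as `FindFirstFile`/`FindNextFile` return them) rather than `Directory.GetFiles` (which builds the entire array before returning). A minimal Python sketch of the same idea, using `os.scandir`; the helper name `iter_files` is made up for illustration:

```python
import os
import tempfile

def iter_files(root):
    """Lazily yield file paths under root without first building the
    whole list in memory -- analogous to Directory.EnumerateFiles in
    .NET, as opposed to Directory.GetFiles returning a full array."""
    with os.scandir(root) as entries:
        for entry in entries:
            if entry.is_file():
                yield entry.path

# Demo on a small temporary directory; with 30,000+ files the point is
# that you can start processing (or stop early) before the full listing
# completes.
demo = tempfile.mkdtemp()
for name in ("a.png", "b.png", "c.png"):
    open(os.path.join(demo, name), "w").close()

first_three = sorted(os.path.basename(p) for p in iter_files(demo))
print(first_three)  # ['a.png', 'b.png', 'c.png']
```

Lazy enumeration does not make the network share faster, but it avoids the single long blocking call that was timing out.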

Asked by: Guest | Views: 248
Total answers/comments: 5
Guest [Entry]

"I've worked on a SAN system in the past with telephony audio recordings which had issues with the number of files in a single folder. That system became unusable somewhere near 5,000 files (on Windows 2000 Advanced Server with an application in C#.Net 1.1); the only sensible solution we came up with was to change the folder structure so that each folder held a more reasonable number of files. Interestingly, Explorer would also time out!

The convention we came up with was a structure that broke the files up by year, month and day - but that will depend upon your system and whether you can control the directory structure..."
Guest [Entry]

Definitely split them up. That said, stay as far away from the Indexing Service as you can.
Guest [Entry]

"None. .NET relies on underlying Windows API calls that really, really hate that many files themselves.

As Ronnie says: split them up."
Guest [Entry]

You could use DOS?
Guest [Entry]

You could also look at either indexing the files yourself, or getting a third-party app like Google Desktop or Copernic to do it and then interfacing with their index. I know Copernic has an API that you can use to search for any file in its index, and it also supports mapping network drives.
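Indexing the files yourself can be as simple as walking the tree once (e.g. on a schedule, off-peak) and keeping a filename-to-directory map, so lookups never touch the slow network share. A minimal sketch; the `build_index` helper is a made-up name and a real index would be persisted (pickle, SQLite, etc.) rather than rebuilt per run:

```python
import os
import tempfile
from collections import defaultdict

def build_index(root):
    """Walk the tree once and map each filename to the directories
    containing it; later queries hit the in-memory map only."""
    index = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            index[name].append(dirpath)
    return index

# Demo: index a small temporary tree, then query it.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "2008", "09"))
open(os.path.join(root, "2008", "09", "tile.png"), "w").close()

index = build_index(root)
hits = index["tile.png"]
print(hits)
```

The trade-off is staleness: the index is only as fresh as the last walk, so this suits tile sets that change rarely rather than directories with constant churn.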