Welcome to the Mythicsoft Q&A site for:

- Agent Ransack
- FileLocator Lite
- FileLocator Pro

Please feel free to ask any questions on these products or even answer other community member questions.

0 votes

Hi all,

Is it possible to avoid corrupt files, or at least specify a number of attempts at searching or time spent searching per file?

A recent search I made on around 750GB of data consumed 10GB of memory, even though I'd specified a maximum memory consumption of 2500MB.

This seemed to be down to a corrupt 8KB Excel file, which the search appeared to be throwing all its resources at.

Removing this single file and re-running the search worked as expected (with no more than 2500MB consumed at any point).

Thanks in advance for any help!


asked by (190 points)

3 Answers

0 votes

I'm afraid there's currently no way to do this, but I'll make sure it's added to the Issue Tracker as a problem.

If possible, could you please send the bad XLS file to support@mythicsoft.com?

answered by (65.9k points)

Thanks for that. Unfortunately although I'd love to send the file over to help, it wouldn't be appropriate.

0 votes

I'd like to raise this again as I have a follow-up question, and the issue is consuming quite a lot of time for us, which I'd like to reduce if at all possible.

We have very large data sets that we need to search, with deep directory structures. If a corrupt file is encountered during a search, the behaviour is normally as described above. You can usually make out roughly where the problem file is located from the status bar at the bottom of the screen, but if it's buried deep within the structure it's very difficult to narrow down.

Is there a better way to see which file is currently being searched?

Once the file is identified, we can remove it and rerun the search, usually successfully.

Thanks for your help,


answered by (190 points)

For the file types that are causing a problem, e.g. DOC, DOCX, try switching the 'Use IFilter if available' option in the Configuration settings. IFilters tend to be better at handling bad files. A sample 'bad' file would be very useful.

0 votes

I think I've found an acceptable workaround for this. If FL encounters a corrupt file, it will continue to interrogate the file for an unusually long time (essentially, it will never finish). The path and name of the file are shown in the status bar at the bottom of the screen. If the path is long and your screen resolution is low, you may not be able to see the full path, in which case you might need to investigate further.

- Find the offending file
- Pause the search (do not stop the search; this will make FL unresponsive and all will be lost)
- Move the file (you can't delete the file at this point, but moving seems to work)
- Resume the search

This is correct for version 7.2 on Server 2008.
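As a complement to the manual steps above, one option is to pre-scan a directory tree for obviously broken Excel files before running the search. FileLocator itself doesn't offer this, so the sketch below is a hypothetical standalone script: it relies on the fact that modern .xlsx files are ZIP archives, so a file that fails a basic ZIP integrity check is a likely culprit. Note this does not cover legacy binary .xls files, and a file can pass this check and still be malformed internally.

```python
import os
import zipfile

def find_suspect_xlsx(root):
    """Walk a directory tree and flag .xlsx files that fail a basic
    ZIP integrity check (a valid .xlsx is a ZIP archive)."""
    suspects = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if not name.lower().endswith(".xlsx"):
                continue
            path = os.path.join(dirpath, name)
            try:
                with zipfile.ZipFile(path) as zf:
                    # testzip() returns the name of the first corrupt
                    # member, or None if the archive reads cleanly
                    if zf.testzip() is not None:
                        suspects.append(path)
            except (zipfile.BadZipFile, OSError):
                # Not a readable ZIP at all - flag it
                suspects.append(path)
    return suspects

if __name__ == "__main__":
    for path in find_suspect_xlsx(r"D:\data"):  # hypothetical search root
        print(path)
```

Moving the files this script reports out of the search scope first could avoid the stuck-search situation entirely, at the cost of one extra pass over the data.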

answered by (190 points)

I'd also like to add that I would love to give you a file to work with (as a developer, I understand the need), but the NDA I've signed prevents me from doing this.