Re-process files on failure
Posted: Mon Mar 06, 2017 5:23 pm
Hi,
I have several processes that monitor a directory for a given file (or files), perform some manipulation on them (rename, upload, etc.), and then archive the files. These are kicked off from a filesystem monitor directly on the MFT server. One of the last steps (after the uploads, etc.) is to remove the source file so it's not processed again the next time the monitor runs. However, we've run into situations where, if the process fails for whatever reason (usually a transient communications issue with the remote site), we *want* the monitor to pick up the same file and attempt to process it again. Because the monitor has already "seen" the file, though, it just ignores it.
What's the best way to work around this? I thought about having an "error" module that updates the timestamp on the file (much like the Unix "touch" command) so it would look like a new one, but I wanted to make sure there isn't a better way to go about this.
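For what it's worth, the "touch"-style workaround described above is straightforward to sketch. This is a minimal illustration only, assuming the monitor keys off the file's modification time; the function name and path are hypothetical, and the actual error-handling module in the MFT product may offer a built-in way to do this instead.

```python
import os
import time

def touch_for_reprocessing(path: str) -> None:
    """Set a file's access and modification times to "now", like the
    Unix `touch` command, so a timestamp-based directory monitor
    treats the file as new on its next scan.

    `path` is a placeholder for the stranded source file.
    """
    now = time.time()
    # os.utime takes a (atime, mtime) tuple; updating mtime is what
    # most monitors compare against their last-seen timestamp.
    os.utime(path, (now, now))
```

An error step in the workflow could call this on the source file before exiting, leaving the file in place for the next monitor run to retry.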