OK, I've been beating my head against the wall for a couple hours today trying to figure this out. I've seen a few possible solutions, but none of them have been a perfect fit - sorry for the long read but the criteria are somewhat exacting. I hope someone here will have a better idea.
I'm trying to come up with a way to automatically enforce our policy of deleting data from our FTP sites once it's older than 30 days. We're running Titan FTP on Windows Server 2008 R2 Datacenter. I first checked with Titan to see if they had a built-in function for this; they don't, so I turned to third-party tools.
Cyber-D Autodelete (http://cyber-d.blogspot.com/2005/10/cyber-ds-auto-delete-101.html) seemed to do everything we needed, but it only runs at system startup. This machine is a production web server, so it rarely gets rebooted, and that wasn't going to work. I could kick it off manually, but the whole idea is to remove the human element (and with it the likelihood of cut corners).
The description on that post recommended Belvedere as an alternative (http://windowssoftwaretips.blogspot.com/2010/04/belvedere.html). This one was great! It actively monitors folders instead of relying on a scheduled task, but it treats folders as if they were files. When a file is uploaded a couple of layers deep in the tree, the topmost folder's modification date doesn't change, so Belvedere would delete that entire directory and its contents because the parent directory itself was older than the 30-day threshold.
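To illustrate what I mean (all paths here are made up), writing a file updates its immediate parent folder's timestamp but not the grandparent's:

    :: Hypothetical layout: D:\FTPRoot\UserA\Proofs already exists.
    :: Uploading a file touches Proofs' LastWrite time...
    echo test> D:\FTPRoot\UserA\Proofs\upload.dat
    :: ...but UserA keeps its old timestamp (/T:W lists last-written times):
    dir /T:W D:\FTPRoot\UserA

So any rule keyed to folder age sees UserA as "old" even when it holds brand-new files.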
OK, fine, if you want something done right, do it yourself, yeah? So I rolled up my sleeves and went looking for a batch or script solution. RoboCopy's move option seemed like the logical choice; there's even existing info about it (like http://community.spiceworks.com/topic/447528-automatically-move-files-older-than-x-days?headlined, or http://www.codesingh.com/2009/08/using-robocopy-to-delete-old-files-from.html). I haven't seen any way to exclude directories from RoboCopy wholesale, though. There's a flag (/XD) to exclude specific named directories, but I want all directories ignored entirely. Also, according to TechNet, "the MOVE option will fail if any files are open and locked," which an incoming FTP transfer could do, could it not?
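While digging I also ran across forfiles, which ships with Server 2008 R2 and exposes an @isdir variable, so it can be told to act on files only. Something like this is my current best guess (untested; D:\FTPRoot stands in for our real root):

    :: Delete files (never folders) whose last-modified date is 30+ days old,
    :: recursing through every subdirectory under the FTP root.
    forfiles /P "D:\FTPRoot" /S /D -30 /C "cmd /c if @isdir==FALSE del @path"

My understanding is that if del hits a file locked by an in-progress FTP transfer, forfiles just reports an error for that one file and keeps going, rather than aborting the whole run the way the RoboCopy docs imply, but I haven't verified that.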
The ideal solution would:
1. Actively monitor for old files or run as a scheduled task
2. Ignore all directories without needing to know their names, operating only on the files in the tree. Even if every file in a directory is older than 30 days, the folder itself should survive. (For example, our FTP users each have folders for Art, Data, and Proofs; those should stay put no matter how old they or their contents are, even when they're empty.)
3. Not require me to get a new degree to know how to implement it
Does anyone know of an application or script that would meet these criteria? For reference, the sketch below is as far as I've gotten on my own.
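This is a guess, not a working solution. It assumes the forfiles behavior above holds, that our FTP root is D:\FTPRoot, and that the script lives at C:\Scripts\purge-ftp.cmd:

    :: C:\Scripts\purge-ftp.cmd
    :: Delete files (never folders) last modified 30+ days ago,
    :: anywhere under the FTP root, leaving the folder structure intact.
    forfiles /P "D:\FTPRoot" /S /D -30 /C "cmd /c if @isdir==FALSE del @path"

Registered as a daily task so nobody has to remember to run it:

    schtasks /Create /TN "FTP 30-day purge" /TR "C:\Scripts\purge-ftp.cmd" /SC DAILY /ST 02:00 /RU SYSTEM

If that's roughly right, great, but I'd still prefer something battle-tested over my own hand-rolled task.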