A friend who uses some of my solutions came to me with a problem. He has a central repository server (which he also uses for some monitoring), and on this server there is a folder called FTPDownload. Several files are downloaded automatically into this folder by FTP.
One specific file (a CSV file) is downloaded once per day, and he needs a solution that identifies only this file and stores it in another SQL Server. He has a LOT of “robot programs” that do operations like that for other files. These programs spend all their time polling the folder to check whether a new file has arrived.
What he asked me was whether we could do something without creating another “robot” or changing the code of the existing ones.
YES WE CAN !!!!
First, let’s create the WQL query to monitor the specific file in the specific folder. In our case the folder is c:\FTPDownload and the file is FileImport.CSV:
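A WQL query for this can look like the sketch below. It watches for __InstanceCreationEvent over the CIM_DataFile class in the root\CIMV2 namespace; note that WQL requires the backslashes in the path to be doubled:

```
SELECT * FROM __InstanceCreationEvent WITHIN 300
WHERE TargetInstance ISA 'CIM_DataFile'
  AND TargetInstance.Drive = 'c:'
  AND TargetInstance.Path = '\\FTPDownload\\'
  AND TargetInstance.FileName = 'FileImport'
  AND TargetInstance.Extension = 'CSV'
```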
Then let’s create the job, called IMPORTCSV, that runs a PowerShell script called importCSV.ps1 located in c:\FTPScripts:
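The import script itself will depend on the CSV layout, but a minimal sketch could look like the following. The server, database, table, and column names (SQLSERVER01, StagingDB, dbo.FileImport, Col1/Col2) are placeholders, and it assumes the Invoke-Sqlcmd cmdlet is available on the server:

```powershell
# importCSV.ps1 - minimal sketch; adjust all names to your environment
$csvPath = 'C:\FTPDownload\FileImport.CSV'

# Read the CSV and insert each row into the destination table
$rows = Import-Csv -Path $csvPath
foreach ($row in $rows) {
    # Col1/Col2 are assumed column names - match them to the real CSV header
    $query = "INSERT INTO dbo.FileImport (Col1, Col2) " +
             "VALUES ('$($row.Col1)', '$($row.Col2)')"
    Invoke-Sqlcmd -ServerInstance 'SQLSERVER01' -Database 'StagingDB' -Query $query
}
```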
Now it is time to create the SQL WMI Alert that monitors the creation of this file, based on our WQL:
Now let’s set the alert’s response to run the job IMPORTCSV:
Alert Code :
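As a sketch, the whole alert (WQL query plus the job response) can be created in T-SQL with msdb.dbo.sp_add_alert; the alert name below is just an example:

```sql
EXEC msdb.dbo.sp_add_alert
    @name          = N'Alert_FileImportCSV',
    @wmi_namespace = N'\\.\root\CIMV2',
    @wmi_query     = N'SELECT * FROM __InstanceCreationEvent WITHIN 300
                       WHERE TargetInstance ISA ''CIM_DataFile''
                       AND TargetInstance.Drive = ''c:''
                       AND TargetInstance.Path = ''\\FTPDownload\\''
                       AND TargetInstance.FileName = ''FileImport''
                       AND TargetInstance.Extension = ''CSV''',
    @job_name      = N'IMPORTCSV';
```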
And done. Every time a file called FileImport.csv is created in the folder c:\FTPDownload, the alert fires up to 5 minutes later (the WITHIN 300 clause in the WQL sets a 300-second polling interval) and runs the job. Why 300? Just to give the file time to finish being saved in the folder.
TIP: if something goes wrong with the file and an error is raised, the job will still finish without errors even if you use Try-Catch, because the PowerShell exit code is 0. The error will show up in the job history, but I need the job to finish with an error.
How to solve this? Just add throw “Failure” in the catch block and the exit code changes to 1:
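A sketch of the pattern, using the same placeholder names as before:

```powershell
try {
    # ...the import logic from importCSV.ps1 goes here...
    Invoke-Sqlcmd -ServerInstance 'SQLSERVER01' -Database 'StagingDB' -Query $query
}
catch {
    # Re-throwing turns the handled error into a terminating one,
    # so PowerShell exits with code 1 and the job step fails
    throw "Failure"
}
```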
and the job will finish with an error:
Now you can send an email, either from PowerShell or from the SQL job, reporting the error or the success. It is up to you.
After all, it’s all about automated solutions.
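For the PowerShell option, one way is Send-MailMessage, for example inside the catch block. The addresses and SMTP server below are hypothetical:

```powershell
# Hypothetical addresses and SMTP server - replace with real values
Send-MailMessage -From 'sqlagent@contoso.com' -To 'dba@contoso.com' `
                 -Subject 'IMPORTCSV job failed' `
                 -Body 'The FileImport.CSV load raised an error.' `
                 -SmtpServer 'smtp.contoso.com'
```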