I created this so I could run multiple SQL scripts that are saved in a folder and output the results into individual CSV files.
It could fairly easily be altered to simply run the SQL scripts (which may be stored procedure calls, etc.) without capturing the output, by removing the pipe to Export-Csv; see the sketch below.
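For example, here's roughly what the line inside the loop would become in that case (a sketch using the same variable names the full script defines below):

Invoke-Sqlcmd -ServerInstance $Server -Database $Database -QueryTimeout $QueryTimeout -InputFile $scriptFile.FullName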
The Invoke-Sqlcmd line does all the work, but I've put it in a foreach loop so it picks up each .sql script in the set folder. I've also included logging to a file so you can see what happened and what any errors were.
When writing any code I try not to hard-code anything and use variables instead, one of the first things I learnt when learning VB all those years ago. Doing that allows the code to be reusable and easier to adjust.
All you need to do to use this is adjust the variables at the top; you could also turn those into parameters so you can call the script remotely, providing the values.
I created this only the other day so I'm sure I'll update it; if I add anything flash I'll make sure to share.
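If you do want to provide the values from outside, here's a minimal sketch of what the variable block could look like as parameters (the param block and the RunSqlScripts.ps1 name are my assumptions, not part of the script below):

param (
    [string]$csvOutputFilePath = "C:\Scripts", #Set Output File Path
    [string]$sqlScriptsFilePath = "C:\Scripts", #Set Location of Scripts
    [string]$LogFilePath = "C:\Scripts", #Set Location of Logs
    [string]$Server = "YourServerName", #Set Database Server
    [string]$Database = "YourDatabaseName", #Set Database to use
    [int]$QueryTimeout = 1800 #Query timeout in seconds
)

You could then call it with something like: powershell .\RunSqlScripts.ps1 -Server "OtherServer" -Database "OtherDatabase"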
# Running this locally you may need to run the two lines below first to load the SQL snap-ins
# add-pssnapin sqlserverprovidersnapin100
# add-pssnapin sqlservercmdletsnapin100
$csvOutputFilePath = "C:\Scripts" #Set Output File Path
$sqlScriptsFilePath = "C:\Scripts" #Set Location of Scripts
$LogFilePath = "C:\Scripts" #Set Location of Logs
$Server = "YourServerName" #Set Database Server
$Database = "YourDatabaseName" #Set Database to use
$QueryTimeout = 1800 #Query timeout in seconds (30 minutes)
$LogFileName = "RunScriptsLog_" + (Get-Date -f yyyy-MM-dd-HH-mm-ss-fff) + ".Log" #Set up the Logfile name to use ("fff" gives milliseconds; "ms" would just repeat minutes and seconds)
$Script:LogFile = "$LogFilePath\$LogFileName" #Set up the path for the log with the filename
#Add header info to the log file
Add-Content $LogFile "SQL Script Execution"
Add-Content $LogFile "Run Started: $(Get-Date)" #BeginTime/CompleteTime are set per script inside the loop, so they are logged there rather than here
Add-Content $LogFile "Server: $Server"
Add-Content $LogFile "Database: $Database"
Add-Content $LogFile "CSV Output File Path: $csvOutputFilePath"
Add-Content $LogFile "SQL Scripts File Path: $sqlScriptsFilePath"
Add-Content $LogFile "Log File Path: $LogFilePath"
Add-Content $LogFile "Query Timeout: $QueryTimeout"
Add-Content $LogFile ""
#Go find the scripts to run
#We are only interested in files that end with .sql
$scriptFileList = Get-ChildItem $sqlScriptsFilePath -Filter "*.sql"
#For each script file found in the script folder run the below
foreach ($scriptFile in $scriptFileList)
{
    #We use a try so we can catch any errors
    try
    {
        $BeginTime = Get-Date #Record the time this script started
        #Build the output file name from the script name:
        #Remove .sql from the name (BaseName), prefix the database name,
        #add a timestamp and add the .csv extension
        $OutPutFileName = $Database + " - " + $scriptFile.BaseName + "_" + (Get-Date -f yyyy-MM-dd-HH-mm-ss-fff) + ".csv"
        Write-Host "Executing script: $($scriptFile.Name) against server: $Server. OutPutFileName: $OutPutFileName" #Output to screen what script is about to be run and what file it will create
        #Run the script on the set server/database and output it to the file
        Invoke-Sqlcmd -ServerInstance $Server -Database $Database -QueryTimeout $QueryTimeout -InputFile $scriptFile.FullName | Export-Csv -Path "$csvOutputFilePath\$OutPutFileName" -NoTypeInformation
        #Add details to the log file on what happened including the begin and complete times
        $CompleteTime = Get-Date
        Add-Content $LogFile "Script Execution - BeginTime: $BeginTime. CompleteTime: $CompleteTime. Script: $($scriptFile.Name). OutPutFileName: $OutPutFileName"
    }
    catch #If there is an error we catch it here and output it to the screen and to the log file
    {
        $CompleteTime = Get-Date
        Write-Host "**Error** running Script: $($scriptFile.Name). BeginTime: $BeginTime. CompleteTime: $CompleteTime. **Error** $_" #Inside the catch block $_ holds the error record
        Add-Content $LogFile "**Error** running Script: $($scriptFile.Name). BeginTime: $BeginTime. CompleteTime: $CompleteTime. **Error** $_"
    }
}
Saturday, 16 March 2013
Syncing folders
I've been using PowerShell for the last 6-12 months now, just on and off. I find it great to play with; being able to run things on the go and link in with different applications gives you a lot of freedom.
One of the earliest pieces I tried was to sync two folders. The reason I wanted to do this is because I keep a lot of our photos etc. in the cloud for safe keeping. But after reading of others having hackers get into their cloud storage and wipe it all, I thought it would be a good idea to keep an onsite backup of the offsite backup.
So I use Google Drive, Dropbox or SkyDrive, which can sync the files back to your computer, and then I take a backup of that. That's where I thought it would be fun to have a play with PowerShell to sync those folders to a backup directory.
So to start off, like any good developer, I went and googled to see who had done this already, and I found a good post by Steven Murawski which does it all for me.
Now I admit there isn't a whole lot of development on my behalf in this one, but this is how I learn things: I go and look at what someone else has done and pull it apart to see how I can use the different components.
At a later date I will post some of my own PowerShell scripts where I have used some of what I learnt from this first one: running multiple SQL scripts from files and outputting the results to CSV, running multiple commands at once in parallel (great for SQL Server), and one I use to move downloaded content from a download directory to various folders based on the content.
powershell "{Location of script file}\Sync.ps1 {Location of Source} {Location of Sync Folder}" PAUSEExample:
powershell "C:\Scripts\Sync.ps1 C:\Pictures D:\PicturesBackup" PAUSEBelow is the actual Powershell script from Steven Murawski
# Requires -Version 2
# Also depends on having the Microsoft Sync Framework 2.0 SDK or Runtime
# --SDK--
# http://www.microsoft.com/downloads/details.aspx?FamilyID=89adbb1e-53ff-41b5-ba17-8e43a2e66254&displaylang=en
# --Runtime--
# http://www.microsoft.com/downloads/details.aspx?FamilyId=109DB36E-CDD0-4514-9FB5-B77D9CEA37F6&displaylang=en
#
[CmdletBinding(SupportsShouldProcess=$true)]
param (
    [Parameter(Position=1, Mandatory=$true, ValueFromPipelineByPropertyName=$true)]
    [Alias('FullName', 'Path')]
    [string]$SourcePath,

    [Parameter(Position=2, Mandatory=$true)]
    [string]$DestinationPath,

    [Parameter(Position=3)]
    [string[]]$FileNameFilter,

    [Parameter(Position=4)]
    [string[]]$SubdirectoryNameFilter
)

<#
.Synopsis
    Synchronizes two directory trees
.Description
    Examines two directory structures (SourcePath and DestinationPath) and uses the
    Microsoft Sync Framework File System Provider to synchronize them.
.Example
    An example of using the command
#>

begin
{
    [reflection.assembly]::LoadWithPartialName('Microsoft.Synchronization') | Out-Null
    [reflection.assembly]::LoadWithPartialName('Microsoft.Synchronization.Files') | Out-Null

    function Get-FileSystemChange()
    {
        param ($path, $filter, $options)
        try
        {
            $provider = New-Object Microsoft.Synchronization.Files.FileSyncProvider -ArgumentList $path, $filter, $options
            $provider.DetectChanges()
        }
        finally
        {
            if ($provider -ne $null) { $provider.Dispose() }
        }
    }

    function Invoke-OneWayFileSync()
    {
        param ($SourcePath, $DestinationPath, $Filter, $Options)
        $AppliedChangeJobs = @()
        try
        {
            # Scriptblocks to handle the events raised during synchronization, I believe used for reporting
            $AppliedChangeAction = {
                $argument = $event.SourceEventArgs
                switch ($argument.ChangeType)
                {
                    { $argument.ChangeType -eq [Microsoft.Synchronization.Files.ChangeType]::Create } { [string[]]$global:FileSyncReport.Created += $argument.NewFilePath }
                    { $argument.ChangeType -eq [Microsoft.Synchronization.Files.ChangeType]::Delete } { [string[]]$global:FileSyncReport.Deleted += $argument.OldFilePath }
                    { $argument.ChangeType -eq [Microsoft.Synchronization.Files.ChangeType]::Update } { [string[]]$global:FileSyncReport.Updated += $argument.OldFilePath }
                    { $argument.ChangeType -eq [Microsoft.Synchronization.Files.ChangeType]::Rename } { [string[]]$global:FileSyncReport.Renamed += $argument.OldFilePath }
                }
            }
            $SkippedChangeAction = {
                [string[]]$global:FileSyncReport.Skipped += $event.SourceEventArgs.CurrentFilePath
                if ($event.SourceEventArgs.Exception -ne $null)
                {
                    Write-Error ('[' + "$($event.SourceEventArgs.Exception.Message)" + ']')
                }
            }

            # Create source provider and register change events for it
            $sourceProvider = New-Object Microsoft.Synchronization.Files.FileSyncProvider -ArgumentList $SourcePath, $Filter, $Options
            $AppliedChangeJobs += Register-ObjectEvent -InputObject $sourceProvider -EventName AppliedChange -Action $AppliedChangeAction
            $AppliedChangeJobs += Register-ObjectEvent -InputObject $sourceProvider -EventName SkippedChange -Action $SkippedChangeAction

            # Create destination provider and register change events for it
            $destinationProvider = New-Object Microsoft.Synchronization.Files.FileSyncProvider -ArgumentList $DestinationPath, $Filter, $Options
            $AppliedChangeJobs += Register-ObjectEvent -InputObject $destinationProvider -EventName AppliedChange -Action $AppliedChangeAction
            $AppliedChangeJobs += Register-ObjectEvent -InputObject $destinationProvider -EventName SkippedChange -Action $SkippedChangeAction

            # Use scriptblocks for the SyncCallbacks for conflicting items
            $ItemConflictAction = {
                $event.SourceEventArgs.SetResolutionAction([Microsoft.Synchronization.ConflictResolutionAction]::SourceWins)
                [string[]]$global:FileSyncReport.Conflicted += $event.SourceEventArgs.DestinationChange.ItemId
            }
            $ItemConstraintAction = {
                $event.SourceEventArgs.SetResolutionAction([Microsoft.Synchronization.ConstraintConflictResolutionAction]::SourceWins)
                [string[]]$global:FileSyncReport.Constrained += $event.SourceEventArgs.DestinationChange.ItemId
            }

            # Configure the events for conflicts or constraints for the source and destination providers
            $destinationCallbacks = $destinationProvider.DestinationCallbacks
            $AppliedChangeJobs += Register-ObjectEvent -InputObject $destinationCallbacks -EventName ItemConflicting -Action $ItemConflictAction
            $AppliedChangeJobs += Register-ObjectEvent -InputObject $destinationCallbacks -EventName ItemConstraint -Action $ItemConstraintAction
            $sourceCallbacks = $sourceProvider.DestinationCallbacks
            $AppliedChangeJobs += Register-ObjectEvent -InputObject $sourceCallbacks -EventName ItemConflicting -Action $ItemConflictAction
            $AppliedChangeJobs += Register-ObjectEvent -InputObject $sourceCallbacks -EventName ItemConstraint -Action $ItemConstraintAction

            # Create the agent that will perform the file sync
            $agent = New-Object Microsoft.Synchronization.SyncOrchestrator
            $agent.LocalProvider = $sourceProvider
            $agent.RemoteProvider = $destinationProvider

            # Upload changes from the source to the destination
            $agent.Direction = [Microsoft.Synchronization.SyncDirectionOrder]::Upload

            Write-Host "Synchronizing changes from $($sourceProvider.RootDirectoryPath) to replica: $($destinationProvider.RootDirectoryPath)"
            $agent.Synchronize()
        }
        finally
        {
            # Release resources
            if ($sourceProvider -ne $null) { $sourceProvider.Dispose() }
            if ($destinationProvider -ne $null) { $destinationProvider.Dispose() }
        }
    }

    # Set options for the synchronization session. In this case, options specify
    # that the application will explicitly call FileSyncProvider.DetectChanges, and
    # that items should be moved to the Recycle Bin instead of being permanently deleted.
    $options = [Microsoft.Synchronization.Files.FileSyncOptions]::ExplicitDetectChanges
    $options = $options -bor [Microsoft.Synchronization.Files.FileSyncOptions]::RecycleDeletedFiles
    $options = $options -bor [Microsoft.Synchronization.Files.FileSyncOptions]::RecyclePreviousFileOnUpdates
    $options = $options -bor [Microsoft.Synchronization.Files.FileSyncOptions]::RecycleConflictLoserFiles
}

process
{
    $filter = New-Object Microsoft.Synchronization.Files.FileSyncScopeFilter
    if ($FileNameFilter.Count -gt 0)
    {
        $FileNameFilter | ForEach-Object { $filter.FileNameExcludes.Add($_) }
    }
    if ($SubdirectoryNameFilter.Count -gt 0)
    {
        $SubdirectoryNameFilter | ForEach-Object { $filter.SubdirectoryExcludes.Add($_) }
    }

    # Perform the detect changes operation on the two file locations
    Get-FileSystemChange $SourcePath $filter $options
    Get-FileSystemChange $DestinationPath $filter $options

    # Reporting object - using the global scope so that it can be updated by the event scriptblocks
    $global:FileSyncReport = New-Object PSObject | Select-Object SourceStats, DestinationStats, Created, Deleted, Updated, Renamed, Skipped, Conflicted, Constrained

    # We don't need to pass any filters here, since we are using the file detection that was previously completed
    $global:FileSyncReport.SourceStats = Invoke-OneWayFileSync -SourcePath $SourcePath -DestinationPath $DestinationPath -Filter $null -Options $options
    #$global:FileSyncReport.DestinationStats = Invoke-OneWayFileSync -SourcePath $DestinationPath -DestinationPath $SourcePath -Filter $null -Options $options

    # Write result to pipeline
    Write-Output $global:FileSyncReport
}
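The command file only passes the two path parameters, but the script also exposes FileNameFilter and SubdirectoryNameFilter, which become exclusions in the sync scope. Here's a rough example of what a call using those could look like (the filter values and paths are just made-up examples, not from the original post):

powershell "C:\Scripts\Sync.ps1 -SourcePath C:\Pictures -DestinationPath D:\PicturesBackup -FileNameFilter '*.tmp' -SubdirectoryNameFilter 'Cache'"
PAUSE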