Tag Archives: PowerShell

Automating SQL Express database backups

SQL Server Express is a fantastic, free database engine for small, standalone databases. However, the limitations of the Express engine can make automating database backups tricky. Because Express lacks the SQL Server Agent, you will likely have to roll your own process for backing up databases. Enterprise database backup solutions exist, but they are typically licensed per server, so burning a valuable license on a small, standalone Express database can be cost prohibitive. There is, however, an easy way to roll your own automated backup solution using a stored procedure, PowerShell, and a Windows scheduled task.

First, we need a stored procedure that accepts a parameter for the directory where we want to save backups. This stored procedure should back up every database except tempdb. Also, the stored procedure needs to be saved in a database that will never get deleted (e.g. master).

Here is a script to create the spBackupDatabases procedure in the master database…
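
A minimal sketch looks something like this (the .\SQLEXPRESS instance name is an assumption, and the T-SQL below is simply a cursor over sys.databases); it is executed from PowerShell via the .NET SqlClient:

# Create dbo.spBackupDatabases in the master database (sketch; adjust for your instance)
$createProc = @'
CREATE PROCEDURE dbo.spBackupDatabases
    @path NVARCHAR(256)
AS
BEGIN
    DECLARE @name SYSNAME, @fileName NVARCHAR(512), @fileDate VARCHAR(12);

    -- Timestamp in yyyyMMddHHmm form (e.g. 201401072105)
    SET @fileDate = CONVERT(VARCHAR(8), GETDATE(), 112)
                  + REPLACE(CONVERT(VARCHAR(5), GETDATE(), 108), ':', '');

    -- Every database except tempdb
    DECLARE dbCursor CURSOR FOR
        SELECT name FROM sys.databases WHERE name <> 'tempdb';

    OPEN dbCursor;
    FETCH NEXT FROM dbCursor INTO @name;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @fileName = @path + @name + '_' + @fileDate + '.bak';
        BACKUP DATABASE @name TO DISK = @fileName;
        FETCH NEXT FROM dbCursor INTO @name;
    END

    CLOSE dbCursor;
    DEALLOCATE dbCursor;
END
'@

$connection = New-Object System.Data.SqlClient.SqlConnection('Server=.\SQLEXPRESS;Database=master;Integrated Security=True')
$connection.Open()
$command = New-Object System.Data.SqlClient.SqlCommand($createProc, $connection)
$command.ExecuteNonQuery() | Out-Null
$connection.Close()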

Each database in the instance will be saved to the backup location specified by the path parameter. As written, the file name for each backup takes the form DatabaseName_yyyyMMddHHmm.bak (e.g. InventoryDB_201401072105.bak); however, you can change this to whatever naming convention you prefer. One important factor to note is that the directory you pass to the stored procedure needs to allow the SQL Server service account “write” permissions. If you are running the SQL Express service as the system account, you will have to grant that account the appropriate permissions on the backup directory.

Let’s test the stored procedure to confirm everything is working. Be sure the destination folder already exists, or the backup will fail with SQL error 3013.
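
A quick way to do that from PowerShell (the D:\SQLBackups\ folder is just an example; note the trailing backslash, since the procedure concatenates the file name onto the path):

# Run the procedure once against the local Express instance (example path)
$connection = New-Object System.Data.SqlClient.SqlConnection('Server=.\SQLEXPRESS;Database=master;Integrated Security=True')
$connection.Open()
$command = New-Object System.Data.SqlClient.SqlCommand("EXEC dbo.spBackupDatabases @path = N'D:\SQLBackups\'", $connection)
$command.ExecuteNonQuery() | Out-Null
$connection.Close()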

Now that we have a stored procedure to back up all databases, we need a script that will run it. In addition, the script needs to delete backup files that are older than 30 days. PowerShell is a perfect fit because it can natively interact with SQL Server using the .NET Framework. The following script will connect to the SQL Express instance, run the new stored procedure, and then delete all backup files that are over 30 days old. Note that the databases are backed up using the SQL Server service account, but the old files are deleted using the account that runs the PowerShell script, so be sure both accounts can read and write to the backup directory.
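
Here is a sketch of that script; the instance name, backup folder, and 30-day retention value are assumptions you will want to adjust:

# --- Settings (example values) ---
$backupPath       = 'D:\SQLBackups\'
$retentionDays    = 30
$connectionString = 'Server=.\SQLEXPRESS;Database=master;Integrated Security=True'

# --- Back up every database via the stored procedure (runs as the SQL Server service account) ---
$connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
$connection.Open()
$command = New-Object System.Data.SqlClient.SqlCommand("EXEC dbo.spBackupDatabases @path = N'$backupPath'", $connection)
$command.CommandTimeout = 3600   # large databases can take a while
$command.ExecuteNonQuery() | Out-Null
$connection.Close()

# --- Delete backups older than 30 days (runs as the account executing this script) ---
$cutoff = (Get-Date).AddDays(-$retentionDays)
Get-ChildItem -Path $backupPath -Filter '*.bak' |
    Where-Object { $_.LastWriteTime -lt $cutoff } |
    Remove-Item -Force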

You will note that the database connection string above uses “Integrated Security=True”. This is because I did not want to store a database user ID and password in the PowerShell script. Instead, I will run the PowerShell script from a Windows Scheduled Task that stores the credentials of a domain account, which makes it harder for someone to obtain a user ID and password for the database. If you are using a SQL account and you are okay with putting the credentials in the connection string, you can use the following…
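
Something like this, with placeholder server, user name, and password:

# SQL authentication: the credentials live in plain text inside the script
$connectionString = 'Server=.\SQLEXPRESS;Database=master;User ID=BackupUser;Password=YourPasswordHere'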

Finally, the last step for automated SQL Express database backups is creating a Windows Scheduled Task. The trick for running a PowerShell script from a Scheduled Task is to specify “PowerShell” in the Program/script section of the Edit Action dialog. Then, in the “Add arguments” section, add the path to your saved PowerShell script.
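
For example, assuming the script was saved as C:\Scripts\BackupSqlExpress.ps1 (a made-up path), the two fields would look like this:

Program/script:  PowerShell
Add arguments:   -File "C:\Scripts\BackupSqlExpress.ps1"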

Note… If you have a problem running the PowerShell script, you can add the “-noexit” switch to the “Add arguments” section so that the error text stays on-screen after the script runs. Just be sure to remove the “-noexit” switch after testing so that PowerShell closes after it finishes running.

Running Linux Commands from PowerShell.

In my lab, I occasionally need to automate maintenance tasks that involve both Windows and Linux systems. For example, I need to back up Windows directories to a Linux-based NAS device, compress and decompress files, delete old backups, etc. Sometimes what I need to do is run SSH commands from PowerShell in a dynamic way. I found some examples online, but they only ran one command at a time. For me, it would be better if I could dynamically create a set of commands and then have them all run consecutively in one SSH call.

To do this, first you need to define the statements you want to run in an array. In my case, I wanted something dynamic so I came up with the following.
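
Mine looked roughly like this; the share path and archive name are placeholders, and the archive name is built dynamically from the current date:

# SSH commands to run on the NAS, in order (example paths and file names)
$archiveName = "backup_$(Get-Date -Format yyyyMMdd).zip"
$sshCommands = @(
    "cat /etc/*release;",   # show the Linux distribution release info
    "cd /share/Backups;",   # change the working directory
    "pwd;",                 # print the working directory
    "unzip $archiveName;",  # unzip a file
    "rm $archiveName;"      # remove the zip file
)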

Basically, the above commands will display the Linux distribution release info, change the working directory, print the working directory, unzip a file, and then remove the zip file. Note that the “;” after each command is required. Alternatively, you can use an “and list” (&&) or an “or list” (||) instead of “;” if you understand how they work.

Now that I have the SSH commands that I want to run, how do I pass them to Linux? Manually, when I want to remotely connect to Linux in an interactive way, I use PuTTY. However, by itself, PuTTY doesn’t have a Windows command-line interface. Thankfully, the makers of PuTTY released Plink, aka “PuTTY Link”, which is a command-line connection tool for PuTTY. Armed with this new information, I downloaded Plink to the same directory as PuTTY and added an alias to my PowerShell script.
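
The alias is one line (the PuTTY install path is an example):

# Point an alias at plink.exe so it can be called like any other command
New-Alias -Name plink -Value 'C:\Program Files (x86)\PuTTY\plink.exe'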

Now that I have an alias for Plink, I can pass my array of SSH commands directly to my Linux machine in one line of code.
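
Something like this (the host, user name, and password are placeholders):

# Join the commands into a single string and run them in one SSH session
plink -ssh admin@nas01 -pw 'NotMyRealPassword' ($sshCommands -join ' ')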

One nice thing about this approach is that the output of the SSH commands is displayed in the PowerShell console. That way, you can see if any Linux-side warnings or errors occur.

In the above example, I’ve added my user name and password as parameters on the command line. Obviously, in a production environment this is not desirable. You can get around this by using public keys for SSH authentication. For more information, check out PuTTY’s help documentation. At the time of this writing, Chapter 8 covered how to set up public keys for SSH authentication.

Here is the finished script.
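
Again, treat this as a sketch; the host, share path, archive name, and credentials are all placeholders:

# --- Alias for Plink (example install path) ---
New-Alias -Name plink -Value 'C:\Program Files (x86)\PuTTY\plink.exe'

# --- Build the command list dynamically (example share and archive name) ---
$archiveName = "backup_$(Get-Date -Format yyyyMMdd).zip"
$sshCommands = @(
    "cat /etc/*release;",
    "cd /share/Backups;",
    "pwd;",
    "unzip $archiveName;",
    "rm $archiveName;"
)

# --- Run everything in one SSH call; output is echoed to the PowerShell console ---
plink -ssh admin@nas01 -pw 'NotMyRealPassword' ($sshCommands -join ' ')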

Some notes worth sharing… Initially, my instinct told me that zipping a large directory locally on the NAS device would be faster than zipping the files remotely from my Windows PC. I assumed the network overhead of downloading the files and then uploading the compressed archive back to the NAS would be a bottleneck. In fact, in my case, it was faster to do it remotely from Windows. This is because the limited RAM and CPU of my consumer-grade NAS device were quickly overwhelmed by the compression task. My Windows box, with a dual-core CPU, 4 GB of RAM, a Gigabit NIC, and an SSD, could compress the files faster than the NAS device despite having to send the data over the network both ways. Other tasks, such as deleting large directories, were significantly faster when run locally on the NAS. Therefore, you will have to experiment to find out what works best for you.

Backup VMs with PowerShell and PowerCLI.

In my lab, I back up my VMs by exporting OVA templates and saving the files on a NAS device. My typical routine involves powering down the VM, exporting an OVA template, waiting about 30 minutes for the export to complete, powering it back up, rinse and repeat. After a few backup iterations, I found this process to be a chore that I occasionally put off. Today, I decided to automate the task and pay down some principal on that technical debt.

I’m fairly comfortable in PowerShell, and some time ago I downloaded VMware’s PowerCLI. I decided to give it another test drive and knock out a real-world problem. The first hurdle was figuring out how to load the modules. PowerCLI loads via a snap-in, so we can add the snap-in and view the available cmdlets like this.
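
For the snap-in based install, that looks something like this (VMware.VimAutomation.Core is the core PowerCLI snap-in):

# Load the core PowerCLI snap-in and list its cmdlets
Add-PSSnapin VMware.VimAutomation.Core
Get-Command -Module VMware.VimAutomation.Core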

Okay, great! We have several hundred cmdlets, and they follow fairly decent naming conventions. Now let’s reproduce my manual workflow in code. First, I need to connect to a host and get a list of all my VMs.
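
Something like this does it (the host name is made up; Connect-VIServer prompts for credentials when none are supplied):

# Connect to the ESXi host and list every VM registered on it
Connect-VIServer -Server esxhost01
Get-VM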

Next, I need to power down running VMs and bring them back online after backing them up.
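
A sketch of that piece for a single VM (the VM name is an example; the graceful shutdown relies on VMware Tools being installed in the guest):

# Gracefully shut down a running VM and wait for it to power off
$vm = Get-VM -Name 'InventoryDB01'            # example VM name
if ($vm.PowerState -eq 'PoweredOn') {
    Stop-VMGuest -VM $vm -Confirm:$false | Out-Null
    while ((Get-VM -Name $vm.Name).PowerState -ne 'PoweredOff') {
        Start-Sleep -Seconds 10               # could loop forever if the guest hangs
    }
}

# ...and bring it back online after the backup
Start-VM -VM $vm | Out-Null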

The last thing I need is to export the OVA to my NAS device. The command we will need is Export-VApp. (I prefer to power down the VM before running this.)
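
With the VM powered off, the export is a one-liner (the NAS share below is an example):

# Export the VM as an OVA file to the NAS share
Export-VApp -VM $vm -Destination '\\nas01\VMBackups' -Format Ova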

Finally, let’s pull it all together and backup all the VMs on the host. This is the completed script.
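
Here is a sketch of the whole thing; the host name and NAS share are placeholders, and note the wait loop mentioned below:

Add-PSSnapin VMware.VimAutomation.Core

# Connect to the ESXi host (prompts for credentials)
Connect-VIServer -Server esxhost01 | Out-Null

$destination = '\\nas01\VMBackups'    # example NAS share

foreach ($vm in Get-VM) {
    $wasRunning = ($vm.PowerState -eq 'PoweredOn')

    # Gracefully shut down running VMs and wait until they are off
    if ($wasRunning) {
        Stop-VMGuest -VM $vm -Confirm:$false | Out-Null
        while ((Get-VM -Name $vm.Name).PowerState -ne 'PoweredOff') {
            Start-Sleep -Seconds 10   # possible infinite loop if a guest never shuts down
        }
    }

    # Export the VM as an OVA template
    Export-VApp -VM $vm -Destination $destination -Format Ova | Out-Null

    # Bring the VM back online if it was running before the backup
    if ($wasRunning) {
        Start-VM -VM $vm | Out-Null
    }
}

Disconnect-VIServer -Server esxhost01 -Confirm:$false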

While it is possible to use Connect-VIServer without having to manually enter credentials, I’ll leave that step up to you. Google “PowerShell credentials SecureString” for more information.

Ideas for the next step? Addressing the possible infinite loop would be wise if you don’t want to periodically check your backup job. Backing up the ESX Host OS and configuration is important especially if your environment is complex. Adding some logging for backup date, VM size, and other metrics would be easy and valuable. You could send “Successful” or “Failed” email alerts if you’re okay with a little more spam. Let me know what you come up with.

Try backing up your VMs today. Tomorrow you might be glad you did.