When I am playing with my almost-2-year-old son, he loves to have me do the same thing over and over again. For instance, he really loves his toy car that I pull back and release. After it stops, he goes and gets the car, and we repeat the process again and again. Oddly enough, we both have a lot of fun playing with this toy car.
However, at work, when someone asks me to do the same thing over and over again, it is not fun. In fact, by the second or third time someone asks me to do the same thing, I am usually looking for a way to automate it. One common DBA request that is a good candidate for automation is restoring a database from a production environment to some kind of test/QA/UAT environment because the developers or users need updated data.
So, after being asked to do this type of task multiple times, I decided to automate this process using PowerShell. And since I like to think of things logically, I separated the process into a few basic tasks:
- Get the newest backup file in the directory
- Find the default file locations of the SQL Server instance
- Read the contents of the backup file
- Restore the backup file to the SQL Server
- Run scripts to adjust database
In this post, I am going to focus on a few parts of the script that I believe are important and explain what I am doing in each step. Some steps I will explain using PowerShell and others with T-SQL; it should be easy enough for most people to convert the T-SQL steps to PowerShell.
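To give you a feel for how the pieces fit together, here is a minimal sketch of the overall flow. The parameter names and placeholder paths are hypothetical, not copied from my actual script:

```powershell
param (
    [string]$BackupDirectory = '\\FilePath',       # placeholder share
    [string]$ServerInstance  = 'ServerName',       # placeholder instance
    [string]$DatabaseName    = 'DatabaseName'
)

# 1. Get the newest backup file in the directory
$backup = Get-ChildItem -Path $BackupDirectory -Filter *.bak |
    Sort-Object LastWriteTime | Select-Object -Last 1

# 2. Find the default file locations (xp_instance_regread)
# 3. Read the contents of the backup file (RESTORE FILELISTONLY)
# 4. Restore the backup (RESTORE DATABASE ... WITH MOVE)
# 5. Run scripts to adjust the database
```

Each numbered step is covered in the sections below.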
Get the newest backup file in the directory
As you can see from the list, the first thing I needed to do was find the newest backup in the directory. Since we append datetime stamps to our backup file names, it was not as easy as referring to a file name that is always the same (i.e. DatabaseName_201302221000.bak versus DatabaseName.bak). However, this turned out to be an incredibly easy step. Initially, I thought I would have to loop through a list of files, but PowerShell makes it a one-liner.
Get-ChildItem -Path FileSystem::$directory -Filter *.bak | Sort-Object LastWriteTime | Select-Object -Last 1
Find the default file locations of the SQL Server instance
As you may or may not know, SQL Server stores the default database file locations in the registry, so we will have to read the registry to obtain that information. This may seem like a more daunting task than it really is: there is an undocumented stored procedure, xp_instance_regread, which comes in handy here because it automatically translates the generic registry path into the instance-specific one. (Note: in my script, I also added parameters so that I could specify which directories to use in case I did not want the default locations.)
EXEC master.dbo.xp_instance_regread N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer', N'DefaultData'
EXEC master.dbo.xp_instance_regread N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer', N'DefaultLog'
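From PowerShell, one way to capture those values is to call the procedure with output parameters through Invoke-Sqlcmd. This is a sketch, not my exact script; the server name is a placeholder, and on instances where these registry values were never set, the procedure can return NULL (which is where my fallback parameters come in):

```powershell
Import-Module SqlServer    # assumes the SqlServer (or older SQLPS) module is installed

$query = @"
DECLARE @DefaultData nvarchar(512), @DefaultLog nvarchar(512);
EXEC master.dbo.xp_instance_regread N'HKEY_LOCAL_MACHINE',
     N'Software\Microsoft\MSSQLServer\MSSQLServer', N'DefaultData', @DefaultData OUTPUT;
EXEC master.dbo.xp_instance_regread N'HKEY_LOCAL_MACHINE',
     N'Software\Microsoft\MSSQLServer\MSSQLServer', N'DefaultLog', @DefaultLog OUTPUT;
SELECT @DefaultData AS DefaultData, @DefaultLog AS DefaultLog;
"@

$defaults = Invoke-Sqlcmd -ServerInstance 'ServerName' -Query $query
$defaults.DefaultData   # e.g. D:\Data
$defaults.DefaultLog    # e.g. L:\Log
```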
Read the contents of the backup file
Next, we have to read the contents of the backup file to determine the number and type (i.e. primary data file, secondary data files, or log file(s)) of database files in the backup. For this, we will use the RESTORE FILELISTONLY command (http://msdn.microsoft.com/en-us/library/ms173778.aspx), which returns a table.
RESTORE FILELISTONLY FROM DISK = '\\FilePath\DatabaseName_201302221000.bak'
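Since this command returns a table, PowerShell can hold the result as objects and split it by file type. A quick sketch, with the server name and backup path as placeholders:

```powershell
# Read the file list out of the backup into PowerShell objects
$fileList = Invoke-Sqlcmd -ServerInstance 'ServerName' `
    -Query "RESTORE FILELISTONLY FROM DISK = '\\FilePath\DatabaseName_201302221000.bak'"

# In the Type column, 'D' marks data files and 'L' marks log files
$dataFiles = $fileList | Where-Object { $_.Type -eq 'D' }
$logFiles  = $fileList | Where-Object { $_.Type -eq 'L' }
```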
Restore the backup file to the SQL Server
Now, let’s get down to business. Here is an extremely simplified version of what you need to do, using the RESTORE DATABASE command (http://msdn.microsoft.com/en-us/library/ms186858.aspx) with the MOVE option.
RESTORE DATABASE [DatabaseName] FROM DISK = '\\FilePath\DatabaseName_201302221000.bak' WITH FILE = 1, RECOVERY,
MOVE 'LogicalDataFile' TO 'D:\Data\DatabaseName.mdf',
MOVE 'LogicalLogFile' TO 'L:\Log\DatabaseName_log.ldf'
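The real value of automating this step is building the MOVE clauses dynamically, so the restore works no matter how many data and log files the backup contains. Here is one way to sketch it, assuming $fileList holds the RESTORE FILELISTONLY output and $defaults holds the default data/log directories from the registry (both produced in the earlier steps; names and paths are placeholders):

```powershell
# Build one MOVE clause per file, sending log files to the log directory
# and everything else to the data directory
$moves = foreach ($f in $fileList) {
    $dir  = if ($f.Type -eq 'L') { $defaults.DefaultLog } else { $defaults.DefaultData }
    $name = Split-Path $f.PhysicalName -Leaf
    "MOVE N'$($f.LogicalName)' TO N'$(Join-Path $dir $name)'"
}

$restore = @"
RESTORE DATABASE [DatabaseName]
FROM DISK = '\\FilePath\DatabaseName_201302221000.bak'
WITH FILE = 1, RECOVERY,
$($moves -join ",`n")
"@

# -QueryTimeout 0 so a long restore is not cut off
Invoke-Sqlcmd -ServerInstance 'ServerName' -Query $restore -QueryTimeout 0
```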
Run scripts to adjust database
Typically, after doing a database restore, there are certain things you need to do that can easily be scripted out. For instance, perhaps you need to change the database owner to “sa”, add or adjust some security settings, change the recovery model, update configuration data for the test environment, scramble data, etc. These you can just tack onto the end of the script or keep in additional scripts that are run separately.
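As a small example of what such a tack-on script might look like, here are two of the adjustments mentioned above, changing the owner to sa and switching the recovery model, wrapped in the same Invoke-Sqlcmd pattern (database and server names are placeholders; adapt to your own environment):

```powershell
$fixup = @"
ALTER AUTHORIZATION ON DATABASE::[DatabaseName] TO [sa];
ALTER DATABASE [DatabaseName] SET RECOVERY SIMPLE;
"@

Invoke-Sqlcmd -ServerInstance 'ServerName' -Query $fixup
```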
So, if someone asks you to do something again and again … think about automating it!