It's been a busy week at work, and I've had to perform multiple dumps of production databases back to a development server. This is something that would normally be quick and easy; however, a couple of factors complicated it and forced some choices about how to perform the task.
Here are the relevant details that informed how I decided to move the backup files:
- The servers are in two different datacenters and the WAN link between them is not very fast
- The backups are stored on a CIFS share that is non-routable for anything other than machines on that VLAN (for security reasons)
- There is only a single machine in the datacenter that has the ability to reach through the firewall to the development server
- I had to get the data across the WAN as fast as possible, but did not want to babysit what would be a several-hour process (it should restart automatically in the case of any issues)
- All the hosts involved are running Windows Server 2008 R2 SP1
- I did not want to install any extra software on any of the servers
- This is not an ongoing process, so I wanted something quick; no need to over-engineer the situation
How to get at the backup
My first thought was to use the improved TCP stack in Windows Server 2008 R2, log on to the development server, and pull the data from the CIFS share. Unfortunately this was out, as the development machine has no route to the subnet from which to pull the data.
Next was to perform a backup directly across the WAN to the development machine. This would not work due to firewall restrictions preventing communication between the machines (and it would also not be network optimized).
The only option left was to use the single machine in the datacenter (let’s call this the jump server), on the relevant VLAN, which had access through the firewall to push the backup to the development box.
How to optimize the copy and have it restartable
Opening a terminal server session to the jump server, opening an explorer window to the source and destination, then dragging the file over would copy the file. However, there have been reported incidents where the drag-and-drop method causes memory problems (here's an example from a forum thread: http://social.technet.microsoft.com/Forums/en-US/winservergen/thread/3f8a80fd-914b-4fe7-8c93-b06787b03662/ ). This method also does not address the concern about automatic restarts of the copy operation.
Opening a terminal server session to the jump server and running copy/xcopy would do the trick, and there are no memory issues that I'm aware of with this method. However, the copy process would not restart in the event of a problem, and neither tool does a particularly good job of managing copies over a network. So this was out.
Opening a terminal server session to the jump server and using robocopy to copy the file made the most sense. Using the /R and /W switches, you can have the copy process retry as many times as needed, with a defined interval between those retries. Robocopy also has a level of network optimization built in. As a bonus, it provides ongoing information on the copy completion percentage and detailed statistics when complete. Robocopy is built into Windows Server 2008 R2, so it also meets the requirement of not installing any extra software.
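As a sketch, the command run on the jump server would look something like this (the share and destination paths are hypothetical, and the retry count and wait interval are illustrative choices, not the actual values I used):

```
:: Copy the backup from the CIFS share through the firewall to the dev box.
:: Paths below are hypothetical; /R and /W values are illustrative.
:: /Z    restartable mode - a partially copied file resumes after a failure
:: /R:n  retry a failed copy up to n times
:: /W:n  wait n seconds between retries
robocopy \\vlan-share\SQLBackups \\devserver\D$\Restores ProdDB.bak /Z /R:100 /W:30
```

One thing worth knowing if you wrap this in a script: robocopy's exit codes are a bitmask, and values below 8 indicate success, so a plain "errorlevel not zero" check will give false alarms.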
Final copy solution
In the end I found myself opening a remote desktop session to the jump server, which had both a route to the backup file and access through the firewall to the development server. From there I used robocopy to perform the file copy over the WAN, giving the best performance as well as automatic restarts after any failures.
I’m curious, would you have handled this differently?