Formatting Number Output

I’ve been working with some large data load processes recently, dumping upwards of a billion records into tables. As these loads are not instantaneous, I found myself wanting to keep an eye on how much data had been loaded.

This is nice and easy to do using the sys.partitions system view (filtering on index_id 0 and 1 — heap and clustered index — so each table is only counted once). Here’s an example from AdventureWorks:

SELECT t.name, rows
FROM sys.partitions p
    INNER JOIN sys.tables t
        ON p.object_id = t.object_id
WHERE p.index_id IN (0, 1)
ORDER BY rows DESC;


This is all good and well, but once the tables reach large values it becomes difficult to tell the difference between 31 million and 312 million at a glance. That’s when I resort to using commas. Unfortunately SQL Server does not make this easy, although it can be accomplished with a couple of converts.
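Converting to money and then to varchar with style 1 inserts the thousands separators (along with two decimal places), and trimming the last three characters drops the trailing .00. A quick illustration, using an arbitrary literal:

SELECT CONVERT(VARCHAR(30), CONVERT(MONEY, 1234567), 1);  -- returns 1,234,567.00
-- trimming the final three characters (the .00) leaves 1,234,567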

SELECT t.name,
    SUBSTRING(CONVERT(VARCHAR(30), CONVERT(MONEY, rows), 1), 1,
        LEN(CONVERT(VARCHAR(30), CONVERT(MONEY, rows), 1)) - 3)
FROM sys.partitions p
    INNER JOIN sys.tables t
        ON p.object_id = t.object_id
WHERE p.index_id IN (0, 1)
ORDER BY rows DESC;


This is still a little arduous to type, so I created a function to do the work. Now it’s just a case of calling that function whenever I want to have the numbers presented to me in a much more readable format.

CREATE FUNCTION dbo.fnNumberComma (@iVal BIGINT)
RETURNS VARCHAR(30)
AS
BEGIN
    DECLARE @oVal VARCHAR(30);

    SELECT @oVal = SUBSTRING(CONVERT(VARCHAR(30), CONVERT(MONEY, @iVal), 1), 1,
        LEN(CONVERT(VARCHAR(30), CONVERT(MONEY, @iVal), 1)) - 3);

    RETURN (@oVal);
END
GO



SELECT t.name, dbo.fnNumberComma(rows) AS [RowCount]
FROM sys.partitions p
    INNER JOIN sys.tables t
        ON p.object_id = t.object_id
WHERE p.index_id IN (0, 1)
ORDER BY rows DESC;


Going Into Standby Using Powershell

Since moving to Windows 8 on my primary laptop I have found myself using PowerShell to perform shutdown and restart tasks. For me hitting the charms and going through the options to do a simple shutdown was just way too much work, especially when I could just hit the PowerShell icon and type stop-computer or restart-computer. 

One annoyance I found was that there was no pause-computer or equivalent to put the machine into standby, something very useful for a laptop.

Given that this was something I wanted to do, I put a new function in my profile to give me that ability. Now I can just type pause-computer and the machine will go into standby. Script below…

Function Pause-Computer
{
    Write-Output "Going to standby..."
    Start-Sleep -Seconds 1
    # SetSuspendState in powrprof.dll suspends the machine
    & "$env:SystemRoot\System32\rundll32.exe" powrprof.dll,SetSuspendState Standby
}

Developing To Specifications

I’m a DBA. As a class of people, DBAs have a tendency to rant a little about developers, and I would certainly be someone you would find in that category. The trouble is that most of the time I don’t think it is the developers’ fault; blaming them is just a case of shooting the messenger.

As an example, let’s look at a new database release that was being requested.

The biggest issue was that the code was provided so late in the cycle that the issues found (and there were a great many of them) could not be fixed. Why could they not be fixed? Because an arbitrary release date had been provided to the customer, and this date could not slip for any reason whatsoever. Artificial deadlines: one of the worst things that devs and DBAs have to deal with.

The developers agreed to get the code fixes added to their backlog and to get it into a sprint for fixing in the next month. So after much discussion and a firm commitment we decided to move ahead with the release.

My next question to the dev team was “how are you going to get the data out? You have a few procs here for loading data into tables, but nothing for being able to consume that data afterwards.”

The response was a little stunning:

The only requirement to get done by is to have the data written to a database. After this deploy, we are going to create a way to get the data out

Outstanding. Way to develop to requirements.

In this instance I cannot really place blame on the dev team; they are just doing what is asked of them. I think the bigger problem sits with the folks who are gathering the business requirements and translating those, along with timelines, up to the business.

I think that it might be time for DBAs to stop pointing fingers at devs and start holding accountable those who are causing these problems, namely the Business Analysts and Project Managers.

Who’s with me on this?

Datatypes and xp_fileexist

I was using xp_fileexist to perform a couple of file and folder checks and ran across one of the things that you have to be careful of when using undocumented extended stored procedures. 

In this case when I was trying to insert the results into a table variable I was getting an error:

Msg 0, Level 11, State 0, Line 0
A severe error occurred on the current command. The results, if any, should be discarded.

That is particularly unhelpful. You can reproduce this error yourself by running:


DECLARE @Folder NVARCHAR(MAX) = 'C:\Backup';
DECLARE @FolderCheck TABLE
(
    FileExists BIT,
    FileIsDir BIT,
    ParentDirExists BIT
);

INSERT INTO @FolderCheck
    (FileExists, FileIsDir, ParentDirExists)
    EXEC master.dbo.xp_fileexist @Folder;

SELECT FileExists, FileIsDir, ParentDirExists
FROM @FolderCheck;

In a basic troubleshooting effort I tried to run xp_fileexist without the insert into the table variable and got another error, this one a little more helpful at first appearance.

DECLARE @BackupDirectory NVARCHAR(MAX) = 'C:\Backup';
EXEC master.dbo.xp_fileexist @BackupDirectory;

Msg 22027, Level 15, State 1, Line 0
Usage: EXECUTE xp_fileexist <filename> [, <file_exists INT> OUTPUT]

I knew that I was passing in the variable and that it was valid; this just didn’t make a lot of sense. I decided to change the datatype of the variable in the hope that it would work:

DECLARE @BackupDirectory NVARCHAR(8000) = 'c:\Backup\NewFolder';
EXEC master.dbo.xp_fileexist @BackupDirectory;

Msg 2717, Level 16, State 2, Line 2
The size (8000) given to the parameter '@BackupDirectory' exceeds the maximum allowed (4000).

So, finally, an error that gave me actionable information. By changing the length of the variable to NVARCHAR(4000) I was able to get the results I was looking for.
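For completeness, the working version, using the folder from the earlier examples, is just the same call with the shorter declaration:

DECLARE @BackupDirectory NVARCHAR(4000) = 'C:\Backup';
EXEC master.dbo.xp_fileexist @BackupDirectory;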

This is just one of those examples where you have to be careful what you are playing with, and experiment to get the results you are looking for. Undocumented XPs can be really useful, but troubleshooting them can also be a nightmare.

How To Log Off of Server 2012

I’m finding little things that are petty annoyances when working with Windows Server 2012.


The most recent of these is the lack of a logoff option when connected to a server via a remote desktop session. When using the charm you get the options to Disconnect, Shut down or Restart the server, but not to log off.

It turns out that the solution is pretty simple, but not intuitive.

Just open up PowerShell and type logoff.

SCCM & SQL Server – A DBA’s Worst Nightmare

Microsoft puts out some great products, no really, they do. There are any number of applications and tools available to let you do pretty much anything. One product gaining popularity recently is System Center Configuration Manager (SCCM), which can be used to provide patch management, software distribution, inventory management, server provisioning and more.

Freddy & SCCM – a nightmare double feature

SCCM is great for businesses that are growing and need to maintain control over the devices used and maintain compliance across the enterprise.

This is all great. Businesses use it, businesses need it. SCCM has been designed to provide a relatively straightforward deployment that does not require any strong level of expertise. This is where SCCM falls down for me, as a DBA.

What is the problem?

SCCM does its own database management. It is a set-it-and-forget-it kind of thing. This is done so that an enterprise without SQL Server DBAs can go ahead and perform the deployment and management without any specialist knowledge.

This is all good and well, except when you do have a SQL Server DBA on staff, you have multiple deployments of SQL Server, and you like to consolidate servers wherever possible.

SCCM does some things which go completely against my wishes as a production DBA:

  • Requires sysadmin on SQL Server to both install and run the application
  • Requires Windows admin rights on the SQL Server
  • Installs software on Windows to perform backups of SQL Server
  • Adjusts SQL Server configuration settings (CLR & max text repl size)
  • Enables the TRUSTWORTHY option for the SCCM database
  • Sets the database recovery model to SIMPLE

Fortunately I found a lot of this information up front and decided that there was no way I was going to try and consolidate this database with any other in my environment. The security model is lacking in the worst fashion, and there is not much worse than taking all control away from a DBA.

I was glad that I made this choice, as SCCM decided to restart SQL Server as part of the installation process. That would have caused a production outage had I attempted to co-locate it with other low-use databases.

Short recommendation

Being brief….if your sysadmins are looking to deploy SCCM in your environment, ask for a dedicated VM for SQL Server. Any attempt to consolidate this database will leave you open to massive security holes and production outages.

Windows Hotfix KB 2661254 Breaks Reporting Services

I have spent the last three weeks troubleshooting an issue with Reporting Services for SQL Server 2008 R2 Service Pack 2 failing to start on a server, and have discovered that a Windows hotfix is causing the issue.

It makes no difference whether you install a slipstreamed version of SQL Server 2008 R2 with SP2 or install SQL Server 2008 R2 and then apply SP2 on top of it: either way, if KB 2661254 is installed, the Reporting Services service will fail to start. You will not get an error indicating the reason for the failure, just that it failed (way to go with the pertinent error messages there, Microsoft).

The Windows hotfix KB 2661254 is an update for the certificate minimum key length, preventing the use of any certificate keys less than 1024 bits long. This is a security measure to help prevent brute force attacks against private keys. Why this breaks SSRS I do not know. The patch can be safely applied to systems running SQL Server 2008 R2 SP1.

For now I have passed along word to the sysadmins not to deploy this particular patch to any Windows machine that runs SQL Server, and have created a Microsoft Connect item in the hope that Microsoft provides a resolution to the issue. Please try this in your own test environment, then upvote and mark that you are able to reproduce the problem on Connect.

Automated Backup Tuning

Tuning your backups is a wonderful thing. You could easily reduce the time it takes to back up your databases by 50% or more just by changing a few settings, and yet so few people do it. I asked myself why and came up with two answers.

  1. People do not know how to tune their backups.
  2. It takes too long to run through the tuning process.

How to tune your backups

I’m not going to go over methods for tuning your backups in this post. There are people who have done a far better job at both explaining the adjustments you can make and that have created videos to help you understand and perform the process yourself.

My biggest concern was the level of effort required to test all the possible permutations of file counts, maxtransfersize and buffercount values. After all, who has time to figure all of that out and then capture the data to look at the relative performance characteristics of each one?

I decided that the best way to do this was to create a nice little test harness which would run through all those tests without manual intervention, and then figure out a way to get the output from all of those backups into some kind of meaningful graph for that instant visual on performance differences.

No manual backup tuning for this DBA

The backup testing script

Step one in the automated tuning is a SQL script I created which accepts a few input variables:

  • @DatabaseName – name of the database you want to use for testing
  • @MaxBackupFiles – maximum number of files you want to write to at a time
  • @BackupFolder – destination folder for the backups
  • @WithCompression – whether or not to use SQL Server backup compression
  • @CreateJob – whether or not to create a SQL Agent job to run the tests
  • @JobLogFileName – file path and name for a log for the job
  • @OutputToScreen – outputs the backup commands to the screen

When executed, the script creates a SQL Agent job which you can then run at a time of your choosing. The job will run through a series of backups for a database (40 at minimum) and capture the information in the job log file.
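To give a feel for what the job ends up doing, here is a minimal sketch of that kind of permutation loop. The database name, the value lists and the backup to NUL are illustrative only, not the actual harness:

-- Sketch only: iterate illustrative BUFFERCOUNT/MAXTRANSFERSIZE combinations
-- and back up to NUL; the real harness writes to files and logs the timings.
DECLARE @sql NVARCHAR(MAX), @bc INT, @mts INT;

DECLARE perms CURSOR FOR
    SELECT bc.n, mts.n
    FROM (VALUES (7), (15), (30)) AS bc (n)
        CROSS JOIN (VALUES (65536), (1048576), (4194304)) AS mts (n);

OPEN perms;
FETCH NEXT FROM perms INTO @bc, @mts;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'BACKUP DATABASE AdventureWorks TO DISK = ''NUL'' WITH BUFFERCOUNT = '
        + CAST(@bc AS NVARCHAR(10)) + N', MAXTRANSFERSIZE = ' + CAST(@mts AS NVARCHAR(10)) + N';';
    EXEC sys.sp_executesql @sql;  -- throughput is reported in the message output
    FETCH NEXT FROM perms INTO @bc, @mts;
END;
CLOSE perms;
DEALLOCATE perms;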

Be warned: performing this many backup tests, with so many settings, can and will impact your server’s performance at some point. Be aware of this, and do not cause issues for your users when performing the tests; they will not like you for doing so.

Visualizing the results

Once the backup testing job is complete a job log file is left behind. This log file contains all of the information for each of the backups performed including the:

  • number of files backed up to
  • maxtransfersize setting used
  • buffercount setting used
  • speed of the backup in MB/sec
  • time the backup took to complete in seconds

Getting that information into a usable form is done with a PowerShell script. A blog post by Jonathan Kehayias described his method for parsing SQLIO output into Excel charts. Based on that script, my updated version parses the backup log file, imports the information into Excel and creates a couple of charts.

These charts show the speed (Figure 1) and backup time (Figure 2) for the default backup options, the absolute fastest backup to NUL, and all of the other options that were tested.

Figure 1. Backup speed results for a 100GB database backed up across a 10Gb network to a CIFS share

Figure 2. Backup time results for a 100GB database backed up across a 10Gb network to a CIFS share

The graphs provide a nice visual representation of the backup performance which helps to quickly identify that there are improvements that can be made.

The default SQL backup runs at just over 200MB/sec and takes about 235 seconds to complete, but we know that we can perform that same backup to a NUL device in about 75 seconds at a rate of over 1200MB/sec.

Just looking through the rest of the data it is clear that a lot of the values tested make a negligible difference to backup performance, so we can make an informed decision on the best file, maxtransfersize and buffercount settings.
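Once settled on, those values simply become part of the regular backup command. As a purely illustrative sketch (the paths and settings here are hypothetical, not recommendations):

BACKUP DATABASE AdventureWorks
TO DISK = 'B:\Backup\AdventureWorks_1.bak',  -- hypothetical paths, striping across two files
   DISK = 'B:\Backup\AdventureWorks_2.bak'
WITH COMPRESSION, BUFFERCOUNT = 30, MAXTRANSFERSIZE = 4194304;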

Hopefully this is something that you can find useful. Please give it a try and send me feedback on your testing efforts.

Downloads: Backup Test Harness.sql | Parse Backup Perf Tests.ps1

Is Technical Writing Important?

Many colleges and universities now offer certificates in technical writing. As a DBA, the idea of technical writing is important to me (and after all, I do have a blog to maintain). Technical writing has quite a lot of scope, though. It can encompass:

  • Writing instruction manuals
  • Building reports
  • Creating graphs and charts
  • Web layout
  • Font choices
  • Working with various tools
  • Analyzing data
  • Building proposals
  • Plus lots more

I think that a lot of technical writing skill comes along as a natural progression of working in the IT business (at least for most people), but what about those folks who are fresh out of college? Do you feel that new graduates should have a feel for technical writing? Would a certificate in technical writing be beneficial in helping graduates get jobs? Would someone with a certificate be more likely to get an interview with your company?

These are all important questions and I am looking for you, the good reader, to assist me in gathering some data.

Please, spend a couple of minutes and fill out the survey below or the version over at SurveyMonkey. I will share the results once the survey is complete.


How To Securely Access Your Windows Home Computer–pt2

In part one I showed you how to set up Bitvise to allow you to make a secure connection to a computer over SSH. That is a great start, but now you need to be able to access that machine from anywhere. You could use the IP address of the computer if it were directly connected to your cable/DSL modem, but the chances are that you have everything sitting behind a router, and so there is a little extra configuration that you need to do.

 

Configuring your router to forward port requests

The process will differ between routers, but the same steps need to be performed whatever the model. You will need to set up port forwarding to the computer on your home network that Bitvise is configured and listening on.

I’m using a Cisco router and that’s what I will use in this example.

  • Firstly, grab the IP address of your computer by running ipconfig. Keep this handy.
  • Now login to your router and look for a setting for port forwarding. On my router that is under Applications & Gaming.
  • Add a new entry with external port 22 and internal port 22, and set the IP address to the one you grabbed earlier
  • Enable the port and save the settings

Below you can see how I have Bitvise on port 22 forwarded to the internal IP 192.168.1.127


Hint: if you have multiple home machines that you would like to connect to, you could set up different external ports that point to port 22 on different internal IP addresses.

 

While you have the router options open, go to the status page and note your Internet IP address. This is the IP that you will use for connecting over the internet (it is possible, for a small cost, to use a dynamic DNS service to manage this for you; that’s an optional step I will cover later).

 

Testing your Bitvise connection over the internet

As in part 1, open up your Bitvise client on another machine. This time use the Internet IP Address you captured earlier as the Host entry.

Your connection should work just as before. If it does not work double check the Internet IP Address and ensure that you enabled the port on your router.

 

Using Remote Desktop over SSH

Now that everything is set up with Bitvise and you can make an SSH connection over the internet, it is time to get Remote Desktop up and running.

Open up Control Panel – System and Security – System – Advanced system settings and set your computer to allow Remote Desktop connections. By default administrators will be allowed access; if you want to grant access to another user, just hit the Select Users… button and add them.


 

By default Remote Desktop will listen on port 3389; unless you are comfortable messing around with this stuff, I do not recommend changing it.

Now open up your Bitvise client and go over to the C2S tab to enable client-to-server port forwarding. Here you will add a local port for listening and the remote port that your home computer is listening on.

Note: your listening port cannot be 3389 if the client computer is already set up to accept Remote Desktop connections.

Here I like to use port 13389 for the listener port, and we set the destination port to 3389 for Remote Desktop on the remote computer side.


 

Now connect again and, once the connection is made, open up Remote Desktop and connect to 127.0.0.1:13389.
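As a small shortcut, the same connection can be launched from a prompt with mstsc /v:127.0.0.1:13389, which starts the Remote Desktop client already pointed at the forwarded port (13389 being the listener port chosen above).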


 

Now your connection should be made and you are good to go. Remote access to a home Windows machine from anywhere.

 

Shortcut – Opening Remote Desktop

If you don’t want to go through the hassle of opening Remote Desktop every time, why not let Bitvise take care of that for you? On the Options screen, under On Login, you can check the box for Open Remote Desktop (and uncheck the ones for Open SFTP and Open Terminal) so that whenever you connect it will start up the Remote Desktop session for you and connect.


 

Shortcut – Keeping track of your external IP address

Knowing your external (Internet) IP address is one thing, but what happens if your provider changes that IP while you are out of town? Then you are out of luck.

If you are concerned that this might happen to you, I highly recommend going to DynDns.org and signing up for their $20-a-year DynDNS Pro service. This fantastic service integrates with most routers: you enter your DynDNS credentials on the router, and it updates the DynDNS service with your current IP address. They provide you with a standard domain name that you can use, which makes life a great deal easier and can set your mind at ease. Even if your router does not support this, they also have a client-side option that you can load onto the computer, which will communicate your external IP back to them. Not bad for $20, and it’s the only thing in this solution that costs you anything.

 

Quick summary

  • Download and install Bitvise SSH Server on your home computer
    • Add the required users to Bitvise
    • Enable Remote Desktop access
    • Add port forwarding to your router
    • Capture your external IP address
    • Get a DynDNS account so you can connect using a hostname (optional)
  • Download and install Bitvise Client on your laptop
    • Add a C2S entry so a local port gets mapped to a different remote port
    • Enter the host and login
    • Open up Remote Desktop and enjoy a secure working connection to home

There really are just a few simple steps to completing this process. It seems daunting at first but Bitvise really makes things easy.

I suspect that any good SSH client, such as PuTTY, will allow you to do the selfsame things the Bitvise client does; I have just not tested that yet.

 

Give this a try yourself. It will make a good weekend project. Let me know how it works out for you.

I have also created a downloadable PDF document so you can have the information in one place.