
Un-SQL Friday–Tech Giants

SELECT DATEADD(d,-3,getdate()) AS PostDate

 

I got off to a late start on this one. Life, stuff, and things got in the way of my writing and posting on the newest Un-SQL Friday topic, started by Jen McCown (blog|twitter).

Jen asks “Who are the IT giants you talk to/read that make you feel like a technical poser?” and instructs that invoking Paul Randal (blog|twitter) is not allowed. Per Jen’s instructions, saying “everybody” is also not acceptable. As such, I had to sit down and figure out who I could add to the list.

 

My First Thoughts

When Jen originally mentioned this on Twitter I came up with a few names on the spot.

 

That’s a pretty good list right there and one worth sticking with.

 

Who Are These Folks?

In no particular order:

  • Jonathan Kehayias (blog|twitter) – A SQL Server MVP. I’ve always found Jonathan’s posts interesting, and yet half the time I find myself going “wait, what?” and then, after re-reading, going “oh, duh”. If you’ve not read them, go to his blog and read his series 31 Days of Extended Events.
  • Aaron Bertrand (blog|twitter) – Another SQL Server MVP, Aaron is a senior consultant for SQL Sentry. He frequently posts on various items, but what really makes him stand out to me is his ability to quickly get a handle on new features. He has already written a whole host of entries about SQL Denali.
  • Adam Machanic (blog|twitter) – Yet another SQL Server MVP (I’m seeing a trend here). Adam would easily make this list for sp_whoisactive alone, but that would do a disservice to everything else he posts on performance and parallelism (not to mention his being the brains behind T-SQL Tuesday).
  • Buck Woody (blog|twitter) – Buck isn’t an MVP; however, I believe the only reason for that is that he works at Microsoft. Buck is first and foremost a teacher. He officially teaches at the University of Washington, but his love of imparting knowledge rings through in every post. Spend a week reading his Carpe Datum blog; you’ll be better off for it.
  • Glenn Berry (blog|twitter) – Oh hey, a SQL Server MVP. Glenn is all about performance, whether it’s tuning your SQL to make it faster or getting the right hardware at the right price. Glenn is so passionate about the hardware side of things that he has a book coming out next month just about SQL Server Hardware. I used to think I knew my hardware, right up until I started reading Glenn’s blog.
  • Sean McCown (blog|twitter) – He’s a SQL Server MVP and a prolific writer. Funnily enough, I was reading Sean’s stuff years ago and didn’t realize it was him until I was looking for a specific article recently. Sean has an education mission: he writes, blogs, creates videos, does webcasts, presents, and cooks. Sean can take something complex and really break it down so it’s understandable. I wish I had half his ability to do that.
  • Grant Fritchey (blog|twitter) – Grant recently announced he was going to work for Red Gate Software, a great pickup that adds a SQL Server MVP to their ranks. Query performance is where I find Grant to be the go-to guy. He’s even written books on SQL Server Execution Plans and SQL Server Query Performance Tuning that I would highly recommend to anyone, beginner to expert.
  • Don Jones (blog|twitter) – One more to add to the MVP list. Don has written a ridiculous number of books and articles on a wide range of subjects, but it’s his blogging on PowerShell that really stands out to me. Check out his book Learn PowerShell in a Month of Lunches and maybe take in more than just calories.
  • Conor Cunningham (blog) – Conor isn’t as prolific a writer as many of the other folks listed here; however, when he writes something you are darn well going to read it. His Conor vs. series isn’t the first thing I go to in my RSS reader, simply because I know anything new there is going to take me a while to read and digest. I was really disappointed to have missed Conor speaking at the PASS Summit 2010.
  • Paul White (blog|twitter) – Paul happily states that he’s not an MCM, MVP, MCITP or MCTS. I can state that Paul’s blog is quite possibly the best constructed and best written I’ve read on SQL internals. Just go and read some of his writings where he dissects the optimiser. I dare you not to be impressed and amazed.
  • Linchi Shea (blog) – I really don’t know anything about Linchi; there’s no about page on his blog. The only thing I do know is that Linchi needs to write more frequently, as I love the way his posts pull apart a topic and really get to the root information.

 

Who Else?

I could easily write another 15-20 names here. I currently have 65 SQL bloggers in my RSS reader. Each one of them at one time or another has made me feel like a tech poser.

It’s actually great to have all these folks out there writing, as any time I think about getting complacent in one area or another I just have to pull up a random post from any one of them, read it, and say to myself “you’ve got a lot to learn yet, kid”.

T-SQL Tuesday #14–Resolutions

Newly minted MVP Jen McCown (blog|twitter) of MidnightDBA fame is this month’s T-SQL Tuesday host. Jen wants us to talk about resolutions. My personal laptop is a 1600 x 900. Oh wait, that’s a lousy play on words.

I don’t actually do the whole resolution thing. Sure, the day ticks over and the year changes from one to the next, but it’s just another day. I know a lot of people have strong feelings about the New Year and the potential for new beginnings; I’m just not one of those people.

That being said I do have some goals that I aim to accomplish within the next few months.

 

Certification

I completed my MCITP: Database Administrator 2008 (exams 70-432 and 70-450) last year and have been figuring that I should look at taking the MCITP: Database Development 2008 (exams 70-433 and 70-451). After all I’ve been doing this stuff for a few years now so it shouldn’t be too bad, right? Anyhow, thanks to Microsoft offering their Second Shot program I decided to just go ahead and do so. My first exam is scheduled for next month. Depending on how that one works out I’ll try for the other in July.

 

Presenting

I spoke at my first SQLSaturday in Salt Lake City last year. I really enjoyed it and want to present at at least four more events this year. I’ve already submitted a PowerShell session for SQL Saturday #67 in Chicago and hope to get chosen (there are a lot of submitted sessions already, with some amazing speakers, so if I don’t get picked there won’t be any disappointment on my part). Unfortunately I’ll be on vacation for the local event in Olympia, so I will be looking to see what other travel excursions I can take.

I also made the decision that I would submit a session to the PASS Summit this year. I think the odds of my getting selected to present there are even longer than they are at SQL Saturday #67; however, I won’t be letting that deter me.

 

Blogging

I’ve certainly picked this up more in the last few months after a very wayward start. My posts usually take at least a couple of hours to put together, mainly because I am trying to impart some knowledge. Rather than just provide a script, I try to explain the methodology behind it in the hope that it will help you, the reader, take it and expand upon what I’ve done.

After reading Paul Randal’s (blog|twitter) post the other day “So you want to blog about SQL Server?” I’ve decided I’m going to intersperse some shorter posts alongside my rambling ones just so that I get some regular content out there.

 

Personal

I really have one key personal goal to achieve in the next few months. I’ve been living in the US for about 12 years now. I’m not going to be moving back to England; I have far too many ties here. As such, I need to knuckle down and get my citizenship sorted out. That is really my number one thing and takes priority over all the others. Fortunately there’s not much more to do than fill in a few forms, pay a bunch of money, and wait to take the Naturalization Test (the link goes to a self test on the USCIS site; take it and see how well you do). The test really should be easy to pass, but I am going to be studying the heck out of it just to be doubly sure that I do. Not something I want to mess up.

 

Once all these things are over and done with I think I’ll just go ahead and take the MCM exam. You know, just to pass the time…

T-SQL Tuesday #13–A Mysterious Business

It’s T-SQL Tuesday again and Steve Jones (blog|twitter), Mr SQLServerCentral, is hosting this month. Steve has provided a topic that is near and dear to the heart of pretty much every DBA – “What issues have you had in interacting with the business to get your job done”.

I really didn’t know what I could possibly write about this given that I’ve never had an issue with the business.

Stopped laughing yet?

Here’s an actual business challenge that I have had the fun to deal with in the last few months.

 

What is the maximum number of columns you can have in a table?

It’s a simple little question like that which instantly sets off the alarms in your head, especially when it comes from a Business Analyst. What worried me even more was that this individual had a habit of promising the moon on a stick to users without actually understanding the bigger picture of what might be involved.

The BA was delighted when I told him that the non-wide table limit was 1,024 columns (based upon the Maximum Capacity Specifications in BOL). This took the plotz-meter from rumble to gurgle.

I heard nothing more from the BA for a couple of months so figured it had been just a curiosity that I had answered, then…

 

Please deploy this work order to create a new table

A work order showed up in my queue. A script was attached to create a bunch of new tables. One of these tables was to have 514 columns, another 365, and another 234, along with a few others in the 30-40 column range. Those are some pretty wide tables. Each of these tables was also to have a whole host of indexes (more than 40 on one of them), and they were expected to contain several million rows of data. There were also FK constraints (well, at least there was some DRI).

With an impending sense of doom I asked why we were creating such wide tables and how they were going to be used. I had a hope that these were going to be used in some kind of OLAP solution. It was a faint hope given that we don’t have Analysis Services deployed anywhere.

I was advised that data was going to be loaded into the tables using a “real-time” ETL process to provide reporting functionality to the user base. There would be a bunch of canned reports that could be called, and there would also be ad-hoc reporting access for a small group of power users.

I could understand if all of these tables were denormalized to the max; however, to actually get useful data out would require performing a minimum of four joins.

 

The business needs them

When I took this back to the development team, they stated that they had developed to the requirements they were given. I can understand and appreciate this; they were working with what they were provided. I have no doubt they would have questioned the requirement and just been told to proceed.

I performed a code review and sent back my recommendations: narrower tables, denormalization, and fewer indexes (given that they could never cover any of the queries that would be executed). I also explained the huge performance issues we were going to experience, the impact on the other databases on the server, and how we were going to see some quite significant disk issues.

My recommendations were returned with “these need to go in, the business needs them and we have made a commitment to get them in by next week”.

 

You’re there to keep the boss happy

My concerns were obviously not being heeded, so it was escalation time. I went to my boss and explained the situation. I went over the concerns I had, the expected performance problems we would experience, the bad design, and the lack of denormalization, which would have been the ideal solution for what the team was attempting to do.

It sat with the boss for a couple of days. Finally he got back to me. It had gone up the chain and come back down again. The decision was that “the business” needed this and until I could provide hard numbers that this was a problem in production it would go in.

That was a bit of an epiphany moment. I found out that I am not actually there to do the best job that I can. I’m not there to get the most out of each and every server, eke out a little more performance from a SQL instance, and get things running all nice and smooth. I came to the realization that I was just there to keep the boss happy.

 

Performance isn’t everything

The new tables went into production. Disk queuing went through the roof, buffer cache went through the floor, and my head went through the desk. Tickets were opened because things were performing so poorly; they were rerouted to the BA and closed as “expected behavior”.

I provided perfmon stats and I showed the severe impact that these new tables and processes were having on our environment. I sat down with management and explained what was going on over and over again figuring that some change would be made.

I tracked down a few of the power-users who were a part of “the business” and asked them what they thought of the whole situation. To a person they told me that this was not what they wanted, not what they asked for and not what they needed.

 

These tables still exist. Performance is still lousy. I’ve stopped taking this to management as it wasn’t keeping my boss happy. I would love to say that the business won, but it doesn’t actually look like the business did. It seems more than likely that this was the brainchild of one BA who got the ear of someone important and made their dream a nightmare reality.

Quick PowerShell Disk Space Check

Today I received a request to copy some database backups over to a file share. As a part of performing this process I generally log on to the target first to check that there is sufficient free space to complete the copy. As I was waiting for the server login to complete I knew that there really had to be a better way to do this.

There are lots of tools out there that can get this data; however, I realized that PowerShell could do it quickly and easily. In less than an hour I had a working function deployed to my profile that I could call to get the information on all disks on a machine.

 

Function DiskSpace ($Servername)
{
    GWMI -ComputerName $Servername Win32_LogicalDisk -Filter "DriveType = 3 AND Size > 0" |
        Select SystemName,
        @{Label="Drive";Expression={$_.DeviceID}},
        VolumeName,
        @{Label="Size GB";Expression={"{0:N1}" -f ($_.size/1GB)}},
        @{Label="Free GB";Expression={"{0:N1}" -f ($_.freespace/1GB)}},
        @{Label="Free Percent";Expression={"{0:N1}" -f (($_.freespace / $_.size) * 100)}} |
        Out-GridView
}

 

Once added to your profile (for more about profiles check out http://msdn.microsoft.com/en-us/library/bb613488(VS.85).aspx) it’s a simple case of opening PowerShell and calling the function along with the machine name.

DiskSpace Server1

You can also provide multiple servers to get results. Just separate the servers with a comma.

DiskSpace Server1,Server2,Server3
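
If you’ve not created a profile before, here’s a minimal way to get one in place so the function persists between sessions (this sketch assumes the default per-user profile path that $PROFILE points to):

# Create the profile file if it doesn't already exist, then open it to paste the function in
if (!(Test-Path $PROFILE)) { New-Item -ItemType File -Path $PROFILE -Force }
notepad $PROFILE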

 

UPDATE:

Shahryar Hashemi (Twitter) quite correctly pointed out that this wouldn’t cover mount points (as I only have them on a single machine, I hadn’t thought to cover that). I’ve updated the script to use Win32_Volume, which gives us information on the mount points. This basic script now won’t give you the volume names; however, it will give you the path to the mount point.

Code Snippet

Function DiskSpace ($Servername)
{
    GWMI -ComputerName $Servername Win32_Volume -Filter "DriveType = 3 AND Capacity > 0" |
        Select SystemName,
        @{Label="Drive";Expression={$_.Caption}},
        # VolumeName is not returned by Win32_Volume
        @{Label="Size GB";Expression={"{0:N1}" -f ($_.Capacity/1GB)}},
        @{Label="Free GB";Expression={"{0:N1}" -f ($_.freespace/1GB)}},
        @{Label="Free Percent";Expression={"{0:N1}" -f (($_.freespace / $_.Capacity) * 100)}} |
        Out-GridView
}


Also, check out what Dave Levy (blog|twitter) has done with this http://adventuresinsql.com/2010/11/get-drive-space-including-mount-points/

 

UPDATE 2:

Based on a great suggestion by Dave Levy to allow a unit of measure to be passed to the function (so it can return capacity and free space in something other than just GB), here’s another updated version of the script. This time pass in the server name and the unit of measure you want to use (KB/MB/GB). If you don’t pass in a value it will give you the space information in GB. Just a note: if you pass in something invalid you won’t get space information at all. Like I said, quick and dirty; there’s no error handling going on here, folks.

Code Snippet

Function DiskSpace2 ($Servername, $unit)
{
    # Default to GB if no unit of measure was passed in
    IF (!$unit) {$unit = "GB"}
    $measure = "1$unit"

    GWMI -ComputerName $Servername Win32_Volume -Filter "DriveType = 3 AND Capacity > 0" |
        Select SystemName,
        @{Label="Drive";Expression={$_.Caption}},
        # VolumeName is not returned by Win32_Volume
        @{Label="Size $unit";Expression={"{0:N1}" -f ($_.Capacity/$measure)}},
        @{Label="Free $unit";Expression={"{0:N1}" -f ($_.freespace/$measure)}},
        @{Label="Free Percent";Expression={"{0:N1}" -f (($_.freespace / $_.Capacity) * 100)}} |
        Out-GridView
}
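
For example, to get the figures back in megabytes (hypothetical server name):

DiskSpace2 Server1 MB

Leave the second parameter off and you’ll get GB, as before.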


 

Give it a try, let me know if this comes in useful.

UnSQL Friday #1: Branding

The fine MidnightDBA (blog | twitter) team brought up the concept of an UnSQL Friday to go along with the T-SQL Tuesdays. In this, the inaugural post, the topic decided upon was branding.

There is a lot written about branding on the web and there have already been some great posts on the topic as a part of this UnSQL Friday (such as http://nuurdygirl.blogspot.com/2010/11/brandinga-kinky-necessity.html).

 

I discovered my brand last week at the PASS Summit, and it’s interesting as it’s a case of accidental branding.

When I joined twitter a couple of years ago it was a nothing thing and I used it very little at first. As time went on I got more into it: following folks, tweeting, and using the #SQLHelp hashtag when I ran into problems or saw folks having problems I could assist with. Initially I had no profile picture. I’m really not a public person, so I didn’t use a photo of myself and instead decided upon a funny image of a bush baby that I found online. I’ve not changed the image even though I’ve thought about doing so quite a few times.

Last week at PASS I was meeting people and being introduced. Folks didn’t recognize my name, and they didn’t recognize my twitter handle (@anonythemouse); however, the vast majority of people I interacted with whom I knew from twitter recognized the bush baby image. In fact, on more than one occasion I was introduced to people as “the guy with the bush baby picture on twitter”.

It seems that even without deciding upon a brand I had one anyway.

Checking SQL Nodes With PowerShell

Have you ever wanted a quick way to check and see what physical node your SQL Server Instance was running on? Yeah, me too.

Within SQL itself this is pretty simple; you can run a quick query:

Code Snippet

SELECT serverproperty('ComputerNamePhysicalNetBIOS')

 

This doesn’t work in SQL 2000 though (and sadly I have a few SQL 2000 Instances hanging around). It also requires you to crack open SSMS, connect to the server and run the query.

While working on something else in Powershell I accidentally discovered a dirty rotten hack that will actually give me the information that I am looking for no matter what the SQL version.

Here’s how the hack works. We use WMI and the Win32_LogicalDisk class to capture the physical server name. We actually pass in the virtual SQL Server name; however, it returns the information for the physical node. This is based around the fact that a server has a C: drive, so if your systems don’t then you might want to change the filter condition.

 

Function PhysicalNode ($SQLInstance)
{
    gwmi Win32_LogicalDisk -ComputerName $SQLInstance -Filter "DriveType=3" |
        Where-Object {$_.DeviceID -match "C:"} |
        Select @{Label="SQL Instance";Expression={"$SQLInstance"}},
        @{Label="Physical Node";Expression={$_.SystemName}}
}

Once added to your profile (for more about profiles check out http://msdn.microsoft.com/en-us/library/bb613488(VS.85).aspx) it’s a simple case of opening PowerShell and calling the function along with the virtual server name.

PhysicalNode SQLVirtual

In this example we are looking for the physical node currently running the virtual server SQLVirtual.
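
If you have a handful of clustered instances to check, you can also pipe a list of names through the function. A quick sketch (the server names here are made up):

"SQLVirtual1","SQLVirtual2","SQLVirtual3" | ForEach-Object { PhysicalNode $_ }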

 

Give it a try and let me know if it works for you.

SQLSaturday #54 Recap

This weekend I had the good fortune to not just attend but also present at SQLSaturday #54 down in Salt Lake City, Utah.

 

Thursday night Nuurdygirl (blog|twitter), DorkDog, and I got in the car and started the long drive down from Seattle to Salt Lake.

14 hours, one wrong turn, and 900 miles later we arrived with plenty of time to spare before the speaker dinner, which consisted of a little food and a lot of loud laughs.

I was able to get a good night’s sleep and found myself up nice and early to get to the event. Under cloudy skies the speakers and volunteers arrived to find that Pat Wright (blog|twitter) had coffee and bagels waiting (along with donuts for the speakers, ’cos we’re special).

 

While the crowds arrived I hung out with some of the other speakers; Denny Cherry (blog|twitter), TJay Belt (blog|twitter), Bill Pearson (twitter), Randy Knight (twitter) and Jason Brimhall (blog|twitter).

 

All too soon it was time for my first session, “Backup & Restore Basics”. I was surprised and pleased at the size of the turnout. The group I was presenting to was very engaged and asked some great questions. Other than fat-fingering a CTRL-R and some good-natured joshing from Pat and Jason in the peanut gallery, things went very smoothly. I did learn one important thing: when presenting, make sure you have copious amounts of water on hand, as the thirst kicks in big time.

 

My session over, I decided to sit in and watch Denny Cherry at work. If you’ve never seen Denny present I’d highly recommend attending a session sometime. There’s no particular session you need to attend; you’ll find yourself learning something and being massively entertained along the way (just be aware, he’s not always PC).

 

While the attendees munched on lunch pizza I did final preparation for my other session of the day, “Be Safe Out There: Ship Those Logs”. There were far fewer people in this session and fewer questions, so I was able to breeze through it very quickly and left folks with a couple of minutes’ break before they headed into their next sessions.

 

The remainder of the day saw me hanging out with the other speakers, finally getting some lunch (thanks Pat) and laughing a great deal.

I’d really like to thank Jason for saving us from someone who shall just be known as SQLStalker, and acknowledge TJay for apologizing to the poor woman who found herself in SQLStalker’s sights.

 

The end of the day found me absolutely exhausted, and I was out cold by 9pm. This wasn’t all bad though, as I was well rested for the long drive back home on Sunday.

 

All in all it was a great event, I got to meet some awesome people and felt good about my presentations.

A HUGE thanks to Pat and TJay for organizing this event. They are already looking towards putting on another SQLSaturday event in September next year. I know that I’ll be there, hopefully you can make it too.

Presenting at SQL Saturday #54

I’ll be at SQLSat #54 this weekend 10/23 and will be presenting a couple of topics.

 

Backup & Restore Basics

The key to being able to recover data in an emergency is to have a good backup and to know how to restore that backup to get up and running again as quickly as possible. In this session we’ll talk about how to design a backup and restore strategy that’s a good fit for you, including discussion around the types of backups that can be taken. We will also work through a couple of demos showing backup and restore in action. After all a backup is only good if you can restore it.

 

Be safe out there: Ship those logs

What happens if your hardware fails or your data center loses power? Do you have a Disaster Recovery (DR) plan in place that you can implement quickly? Log shipping allows you to keep copies of your databases closely synced and ready to come online in an emergency. Log shipping also offers options for read-only warm standby databases to reduce load from your production servers, adding additional value to your investment. In this session we’ll cover Log Shipping: what it is, how it works, and how it differs from its cousin, Mirroring. We’ll discuss practical uses for Log Shipping and work through a demonstration to show you how you can get it up and running in your environment.

 

Be sure to find me and say hello if you’re attending.

CLR Errors – Follow Up

A few weeks ago I posted about an issue I was having with CLR failing on one of my production servers. While a reboot fixed the problem it was a temporary fix, and the issue recurred on a couple of further occasions. With the root cause still eluding us, Microsoft recommended a complete removal and reinstall of the .NET Framework.

One of the Microsoft support team advised running the following query, which loads CLR and then traverses all of the registered assemblies. This quickly shows up any problems with CLR and SQL Server integration, even if the problem is not showing elsewhere.

 

select * from sys.assemblies

 

Seeing the following is not a good thing:

Msg 6512, Level 16, State 27, Line 1

Failed to initialize the Common Language Runtime (CLR) v2.0.50727 with HRESULT 0x80004005. You need to restart SQL Server to use CLR integration features.
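
If you have a number of instances to check, something like the following will run the same query from PowerShell and surface the failure without cracking open SSMS. This is just my own quick sketch using System.Data.SqlClient (the function name is made up, and it assumes Windows authentication):

Function Test-ClrIntegration ($SQLInstance)
{
    # Connect and run the CLR-touching query; a broken CLR should throw the Msg 6512 error above
    $conn = New-Object System.Data.SqlClient.SqlConnection("Server=$SQLInstance;Integrated Security=SSPI")
    try
    {
        $conn.Open()
        $cmd = $conn.CreateCommand()
        $cmd.CommandText = "select * from sys.assemblies"
        $reader = $cmd.ExecuteReader()
        $reader.Close()
        "CLR integration looks healthy on $SQLInstance"
    }
    catch
    {
        "CLR check failed on $SQLInstance : $($_.Exception.Message)"
    }
    finally
    {
        $conn.Close()
    }
}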


Removing the .NET Frameworks

Firstly I downloaded the .NET Framework Cleanup Tool from http://blogs.msdn.com/b/astebner/archive/2008/08/28/8904493.aspx

With the tool installed on the server, and all SQL services shut down and set to manual start (a quick PowerShell sketch for that service step follows the list below), I performed an uninstall of all of the .NET Frameworks in the following order:

  • 3.5
  • 3.0
  • 2.0
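
For reference, here’s a minimal PowerShell sketch of that service step (it assumes the default SQL service names starting with MSSQL and SQLAgent; adjust the wildcards for your instances):

# Stop every SQL Server service and flip it to manual start so nothing restarts mid-uninstall
Get-Service -Name 'MSSQL*','SQLAgent*' | ForEach-Object {
    Stop-Service -Name $_.Name -Force
    Set-Service -Name $_.Name -StartupType Manual
}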

The removal complete, a server restart was in order. Once restarted, it was time to

 

Reinstall the .NET Frameworks

Here’s the interesting thing: there’s actually no need to go through and reinstall each individual Framework. Just download and install .NET Framework 3.5 SP1 from Microsoft, which brings the 2.0 and 3.0 components along with it.

Once the install was complete and the server restarted, I brought the SQL services back online (and reset them to start automatically).
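
And the reverse once everything was back in place (same assumption about service names as in the earlier sketch):

# Set the SQL services back to automatic start and bring them online
Get-Service -Name 'MSSQL*','SQLAgent*' | ForEach-Object {
    Set-Service -Name $_.Name -StartupType Automatic
    Start-Service -Name $_.Name
}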

 

Checking sys.assemblies again gave me positive results:

 

name: Microsoft.SqlServer.Types
principal_id: 4
assembly_id: 1
clr_name: microsoft.sqlserver.types, version=10.0.0.0, culture=neutral, publickeytoken=89845dcd8080cc91, processorarchitecture=msil
permission_set: 3
permission_set_desc: UNSAFE_ACCESS
is_visible: 1
create_date: 2008-07-09 16:20:00.070
modify_date: 2008-07-09 16:20:00.413
is_user_defined: 0

 

Since the reinstall I’ve not experienced further issues. It’s a shame that Microsoft were unable to give me a root cause for the problem; however, it’s great to have it resolved.

Are You Watching The Midnight DBAs?

If you aren’t watching you should be.

 

Sean and Jen McCown are funny, down-to-earth folks who provide fun and entertainment with a bit o’ learnin’ in their weekly shows.

They also blog, rant, post training videos, moderate forums and lots more.

There’s really not a good reason for you not to watch. Even if you can’t make the live Ustream events you can always watch past videos at MidnightDBA.com.

 

Here’s a preview for Season 2