Category: Uncategorized

What’s In A Name?

Back in November (which seems so very long ago) I wrote a post for UnSQL Friday #1 on branding. Feel free to go and read that, I can wait…

…while everyone else is off reading that, here’s a quick synopsis: I fell into an accidental brand with the bush baby image, despite few people knowing, and fewer being able to pronounce, my Twitter handle @anonythemouse (Anony-The-Mouse).

 

Welcome back.

Recently I’ve been giving a great deal of thought to branding and decided it was about time I actively managed my own brand. I can hardly claim to be anonymous any longer given the blog posts, activity on Twitter and presentations (@AnonyTheMouse being a play on anonymous). As such I’ve decided to change things: from next week I’ll be sporting a new Twitter handle, and there will be an accompanying new blog to match. There is also going to be, in the future, a new logo *say it ain’t so Nic, say it ain’t so*. Sorry, the bush baby needs to rest those eyes; they must be feeling really strained after staring for so long.

 

I really want to thank Cami Peacock (blog|twitter), Sean McCown (blog|twitter) and Jen McCown (blog|twitter) for being awesome sounding boards and coming up with great and creative thoughts across the whole branding spectrum.

I’ll be posting more details as things change.

SQLSaturday #68 Recap

This post has been a long time coming. It’s been two weeks since SQLSaturday #68 in Olympia, WA. A busy workload and a dead laptop have made this post extremely late in summing up what was a new experience for me in presenting.

 

Om nom nom

Living (relatively) locally in WA there was no big travel for me to this event. That was nice, although the drive down to Olympia from Bellevue on a Friday afternoon is not the greatest. Luckily the sun was out and the weather warm so I wound down the windows and played bad music while sitting in what seemed like endless traffic.

 

The speaker dinner was, as always, a great event. It’s so great to just be able to sit down and speak to fellow professionals in an informal and relaxed environment. I was fortunate enough to sit and have my meal while listening to the conversation of Buck Woody (blog|twitter), Crys Manson (blog|twitter),  Scott Stauffer (blog|twitter) and John Halunen (twitter).

 

Post meal I followed my usual night-before-presentation routine: going over my demos to check they were all working and then watching the DBAs At Midnight web show.

 

When things go wrong 

Saturday morning saw me arrive at the South Puget Sound Community College event location reasonably early where I ran into Mike Decuir (blog|twitter). We wandered in and headed right for the chocolate muffins before hitting registration. 

As people started filtering in I hung around the speaker area chatting with folks until the first sessions started. 

Rather than attend the first session of the day I sat down to bring up all of my virtual machines and get my demos ready to go, so that I could walk into the presentation room, plug in and be all set to talk PowerShell profiles. I spun up my four VMs and, to my horror, found that the evaluation Windows 2008 licenses had expired, all of my iSCSI targets had vanished and my cluster was dead.

Not good. Deep breath time.  

 

Backup plans 

I’m a DBA; I always have a backup. I knew I could run the presentation from my laptop, just that some of the functionality I wanted to show would be missing. I managed to get a VM up and running, so I decided to work with that. I’d never had to fall back on that DR presentation plan before, so this was a new thing for me. I was more than a little nervous to see how it would go.

 

Presenting: the backup plan 

I walked into the presentation room hoping to get twenty or so folks in attendance. When I got there a lot of the seats were already filled and then more people started coming in. All the seats were taken and still more people came in. Now the nerves really kicked in because I didn’t want to let all these folks down. Luckily there were a couple of friendly faces in the crowd. Cameron Mergel (blog|twitter) and Hope Foley (blog|twitter) were up near the front. It really helped knowing they were there.  

Things started reasonably well and then about 10 minutes in my VM shut down. At this point I was ready for just about anything to be thrown at me, so I started it up again and kept on talking. I felt myself talking a little too fast and tried to slow down a little and worked to get some questions from the audience to help the timing. The crowd was well engaged and asked some great questions. My time went by very quickly and I was very happy to see that I had as many people in the room at the end of the session as I’d had at the start. 

Once things were done I had several folks come up, thank me and say that they had enjoyed the presentation, which was very nice to hear. The slides and demos are available for download here.

Frankly I was exhausted at this point and just wanted to take a nap, but I went and grabbed lunch instead.

 

A mellow afternoon 

After eating my lunch and spending a little time chatting with some folks I gathered what little wits I had left and went to watch Buck Woody talk cloud computing. You can tell Buck is an educator; his sessions are easy and free flowing, and he obviously knows the material inside and out.

I moved from my chair to a more comfortable seat to enjoy Chuck Lathrope (blog|twitter) talking Transactional Replication in the final session of the day. I’ve dealt with Transactional Replication for years and I’m pretty comfortable with it; however, Chuck always has nuggets that I can take away and work with, so when he talks it’s always useful to me.

The day done, there were a few raffle giveaways and then folks descended upon a local tavern for a bite to eat and something to drink.

 

Lessons learned 

This little experience taught me a couple of things. 

Firstly, never build your presentations on evaluation copies of software; it’ll come back and bite you in the rear. I’ll be purchasing an MSDN subscription in the next couple of weeks to ensure that doesn’t happen again.

Secondly, it can be really exhausting presenting. I’m thinking power naps for me after speaking from now on. 

Thirdly, practice the DR portion of the presentation. Figure out what could go wrong and what the mitigation strategy is. While I knew I could get by with a lot less, I had not built a second presentation around that scenario and had to change a lot on the fly. I won’t be making that mistake again.

Goals Update

Back at the start of the year as a part of T-SQL Tuesday #14 I posted a few goals that I wanted to achieve this year.

A quarter of the way through the year I decided to throw out a quick post to see where I stand on those goals.

 

Certification

My plan was to get my Database Development MCTS in Q1 and follow up with getting the MCITP in Q3. Things actually went a lot better than I thought they would with this. I came out of the MCTS feeling good, even though my passing was a close thing.

I had heard that the MCITP development exam was easier, and as I had not used the Second Shot opportunity on the first exam, I decided what the heck and signed up for it two weeks later.

The MCITP was a breeze compared to the MCTS and I passed that and left the test center holding both the MCITP: Database Administrator 2008 and MCITP: Database Development 2008 certifications.

Goal Status: Completed

 

Presenting

My goal was to present at four events this year and to submit a session to PASS.

I was picked to present at SQL Saturday #67 in Chicago. That was a couple of weeks ago and I thoroughly enjoyed myself there and blogged about the fun.

Upcoming is SQL Saturday #68 in Olympia, WA. I was not originally going to submit to this event; however, due to a change in circumstances I was not able to take the vacation I had originally planned. I was again lucky enough to be chosen to present; this will be my second of four (planned) presentations for the year.

The call for speakers for PASS has not opened yet so I have not had the opportunity to submit my session.

Goal Status: 50% on presentation goal

 

Blogging

Blog frequently and include shorter posts just to say something quick.

I went through a job change at the beginning of March which really ate into my time and prevented me from writing posts, reading posts or following Twitter. Now that I have a handle on the new position I have found a little more time and can refocus on this. I’ve still managed to average a post a week so far this year, so I’m ahead of last year.

Goal Status: Not bad, could be better. Need to refocus on shorter posts. No need to wax lyrical all the time.

 

Personal

Submit citizenship paperwork.

I still need to get a couple of pieces of paperwork for this. I should be able to pick those up this month and get everything submitted.

Goal Status: Not there yet

 

Hidden Goals

One of my chief goals this year was to find myself a new job. I was sad, bored and unhappy in my old role. This is obviously one of those things that you can’t just stick out there on a blog when you are searching.

I’m happy to say that I found a position that is a good fit for me, with a lot of upcoming work in which I can leverage my existing experience. There are also plenty of growth opportunities in the role, and I will get to play with aspects of SQL Server that I would not have touched otherwise.

Goal Status: Oh so achieved!

 

New Goals

As the year progresses things need to be reassessed.

I’m not changing any of the goals I had already set. I think they were smart and warranted. I have, however, found myself in a place where I should be able to work towards more than I originally planned, so I am adding a few new goals:

  • Take MCM written test before the end of Q3
  • Get motorbike endorsement on my license
  • Build decent home lab

 

I’m well ahead on my certification goal; in fact I’ve already achieved it. I had questioned internally whether I could complete it at all, and my confidence level was not high. Fortunately my girlfriend Cami (blog|twitter) has more belief in me than I do in myself and persuaded me that I could do it. Given that she didn’t steer me wrong with the certifications, and that she has told me I could get the MCM, I’ve decided to work towards that end. I’m certainly not in a position to take the written exam at this point; however, I feel I could knuckle down and study for a few months, leverage my experience, and have a reasonable shot at getting a fair few of the questions right come September.

Prior to moving to the US I had a license to ride motorcycles. When I moved here and took my driver’s test to get a US license I did not go through the work to get the bike endorsement at the same time, which has left me without one. Now that I live in the Pacific Northwest I look around at all the folks zipping around on their bikes and get jealous. I’ve been talking for the last two years about getting this endorsement; it’s time I got off my rear and actually did it.

It’s frustrating when you want to try something out and you don’t have the space or horsepower on your laptop to do so. I’ve been wanting to build myself a home lab for a while now. I’ve even had chats with Sean McCown (blog|twitter) on the best path towards this. I believe that this is something I can get done by the end of the year. I just need a box or two and a bunch of storage with some virtual machines. Heck, I’ve done that on my laptop; I just need to scale it up.

 

Next follow up on my goals to come the first week of July.

Presenting At SQLSaturday #68

This weekend (4/9/2011) I will be speaking at SQLSaturday #68 in Olympia, WA. My session is slotted in at 10:30am. It will be a reprise of the presentation I gave at SQLSaturday #67 in Chicago titled “PowerShell: Are you checking out my profile?”

I think the title sums it up pretty well, but in case you need more details: PowerShell is a very powerful management tool and you can spend hours writing magical scripts to do all the things you want. Frequently forgotten is the PowerShell profile, a place where you can add your own functions to give yourself lightning-fast access to information. We’ll talk about the power a profile puts at your fingertips. I will also demo (and share) a bunch of short functions that I use frequently for common tasks. I’ll show you my PowerShell profile if you show me yours.

If you decide to attend my session don’t expect much in the way of pretty slides, extended PowerPoint information and the like. This is going to be heavy on the demo side of things and interactive, so bring your questions, thoughts and ideas.

SQLSaturday #67 Recap

This weekend I had the pleasure of attending and presenting at SQLSaturday #67 and boy do these folks in Chicago know how to put together an event!

 

Friday night lights

I flew in Friday afternoon and, after dropping my bag off at the hotel, headed down to DeVry to assist in the prep work for the next morning. When I arrived things were already well in hand and a group of volunteers were hard at work stuffing swag bags. Finishing that led to collating raffle tickets and name badges so that everything was ready for the attendees’ arrival on Saturday.

If you are planning on attending a SQLSaturday event, be it local or remote, please volunteer to assist in the setup. Not only can these events not happen without someone like you being there to help get things ready, but you also get to meet some really cool community people and chat with them. It’s very rewarding and you have no excuse not to.

Once everything was prepped and ready we headed over to a local eatery where SQL Sentry (of the awesome and free Plan Explorer tool) opened up their wallet and paid for the speaker dinner. Please note, this was the first (of many) instances where I looked around the room and inside my head the following played….  

 

TISWAS

Today Is Saturday, Wear A Smile.

As a kid one of my favorite shows was on ITV; it was a hilarious, pie-throwing, dog-spitting, fly-dying couple of hours on a Saturday morning. I loved it. It used to produce the kind of inane grin that you just couldn’t wipe off your face; it was just that enjoyable. That pretty much sums up this event.

TISWAS–Insane and awesome

 

I was there nice and early, ready to help if there were any last-minute things that needed doing, but it was all in hand. I got there just as the bagels and donuts showed up. It was just luck it happened that way, really (oh, and it was so good to have Dunkin’ Donuts again; we don’t get those out here in the PNW).

Attendees started to show up in droves and it started to get crowded pretty quickly so I escaped to the relative quiet of the speaker room.

There was a scary amount of knowledge in that room. Take a look at the session list and you’ll see what I mean. I think that pretty much any question anybody could have come up with as regards SQL Server could have been answered by the combined smarts in there. Did I feel out of place? Heck yeah. That is my issue though because none of the presenters in there were ever anything but awesome. You never get looked down on by these folks, they just treat you like a peer, which I think makes it all the more humbling.

 

Presenting

Rather than attend one of the first sessions of the day I decided to put my demos through their paces.

A week before the event I decided to change things up on my session and make the demos show their stuff a little better. This led me to go out and buy a 128GB SSD and 8GB of RAM for my laptop, and to reinstall Windows. Then I built four shiny new virtual machines including a DC, iSCSI targets, a cluster and three instances of SQL Server.

I’d had some difficulties getting a mount point working in my cluster before the event, but that morning I was able to get it plugged in and working (which probably doesn’t mean much for the presentation itself, however I was glad to have the extra functionality it provided show up in the demo). I tested all the demos and felt pretty confident that I was as prepared as I was going to get.

My session “PowerShell: Are you checking out my profile?” was very well attended. I felt it went well with plenty of audience engagement, and the evals handed in seemed to back that up.

The presentation materials are available for download from http://www.sqlsaturday.com/viewsession.aspx?sat=67&sessionid=3183

 

Remains of the day

After a great lunch I found myself watching Jeremiah Peschka (blog|twitter) talking refactoring. I love his presentations, they flow so well and his slides are artistic and just a punctuation mark to what he is saying. I want to be able to present like that when I grow up.

I ended up missing the second round of afternoon sessions as I was talking clusters with Allan Hirt (blog|twitter). Yeah, I missed out on watching a great session, but it’s not often you have the chance to sit and talk face-to-face with a clustering guru; I considered that hour alone worth the weekend.

I was able to catch one last session where Christina Leo (blog|twitter) talked about SQL Internals. I’ll speak more on that a little later.

With the sessions over it was raffle time. Prizes were won and I got a $100 Amazon gift card. Quality!

The day officially over, things got crazier. Food was eaten, beverages were consumed, karaoke was sung and talking was had until the wee hours of the morning. A really fun night to wind up an awesome day. I even managed to get a couple of hours of sleep before heading for the airport and my flight home on Sunday.

 

Something to change

Christina’s session was on SQL internals. This was her first presentation, and on a highly difficult and complicated subject at that. She did a fantastic job, starting things off with a story and tying it back into the SQL internals. Unfortunately there was one highly obnoxious person in the audience who, for whatever reason, decided that he was not going to agree with anything Christina had to say. He openly challenged her and other people multiple times (and it’s not like he was correct in what he was saying).

It was incredibly disrespectful to someone who was freely giving their time to try and impart knowledge to others. On top of that it took away from the time of the other attendees. At the end of the session more than one person commiserated with Christina on the actions of that one person and told her she did a fantastic job.

If you are attending a session show respect for the person speaking. Even if you think that they are saying something wrong don’t openly challenge them. If you can’t keep yourself from berating the presenter then leave the room and speak to them afterwards in private.

SQLSaturday organizers, I would really like to see a buddy system implemented. While there are some very experienced speakers at these events, there are also those who are new to speaking. I think that every first-time presenter (at the very least) should have an experienced speaker with them in the room, just in case they run into this kind of situation. They might not need to interject, but knowing that someone is there who has the speaker’s back would mean a great deal. A nod of the head to reinforce the speaker’s confidence at a key moment might well mean more than standing up and taking someone to task.

When someone learns to walk a tightrope they have a safety net. Let’s get one in place for new speakers; we don’t want a callous comment or a malicious person to deprive the community of a great speaker.

 

Awesomesauce organizers

As I’ve said this whole event was awesome and I want to call out the team that put it all together:

And a final mention goes out to Jes Schultz Borland (blog|twitter) who showed up Friday with a great many boxes of t-shirts for the attendees.

What Should PASS Be?

A few days ago Andy Warren (blog|twitter) asked the question “What should PASS be?” and issued a challenge for people to throw their own thoughts in about directions the organization should head.

I can’t provide any answers to that without first explaining:

 

What PASS Is To Me

I attended my first PASS Summit back in 2008. I’m far from a social butterfly and I didn’t know anybody when I got to the Summit; I walked around for a week and left having learned a great deal but still knowing nobody.

Funny thing though, attending that Summit is what got me started with being involved in the community at large. While there I saw some signs touting the benefits of Virtual SIGs and went over to ask about them. Blythe Morrow (blog|twitter) helped me get signed up for the DBA one and explained to me how useful this whole Twitter thing could be. Not being captain social I wasn’t sure I’d ever use it but signed up anyway. Nothing to lose right?

Over time I attended a few of the virtual groups and started following folks on Twitter. Then I started interacting. I got to know people; learned a huge amount; started blogging; attended a SQL Saturday; presented at a SQL Saturday; attended the local user group frequently; volunteered for and went to the 2010 Summit.

What was different about the latest Summit? I went in knowing a few folks and came away knowing a lot more.

For me, PASS was about getting started in the community. Now that PASS owns the SQL Saturday brand it’s even more about getting folks started in the community than ever before.

 

What PASS Is Not To Me

I don’t go to PASS for daily community involvement. It’s not the right place; that’s the niche that Twitter fills. I don’t go there for technical content either; that’s why I have 115 RSS subscriptions in Google Reader.

The PASS website is not a place I visit often. Fact is I rarely go to the PASS site; there’s nothing there that I need for the most part, other than information about the Summit or an upcoming event such as 24 Hours Of PASS.

PASS is not forums; that’s why SQLServerCentral and StackExchange exist. Nor is it a daily or weekly newsletter.

 

What Should PASS Be?

The nominated Board of Directors have a tough job and I think for the most part they do it very well. Treading that line between community and company cannot be easy. I know that I’ll never sit on that board and so have to admire the dedication of those that do. That said, the Board needs to take a great deal of care with the nominations after the controversy of last year. How about things being a little less political and more straight up?

The PASS Summit is the keystone that holds everything else in place. Keep on keeping on with that. The one thing I would like to see is something that’s already in the works: moving the Summit every couple of years. While living in WA makes it simple for me to get to the Summit, I know that’s not the case for everyone. There has been a great deal of talk about this, and I think it’s good to move it every once in a while.

The first SQLRally is due to take place this year. It’s being billed as a regional event although it’s more like a mini-summit; I like this idea. Do it again next year, but be sure to move it around the country. Dallas, Chicago and Atlanta will hopefully all be in serious consideration for next year. It might also be worth considering holding SQLRally in Seattle in the years that the Summit moves away.

I’d look at possibly changing one thing for SQL Saturdays. Maybe a small fund could be set up to ensure that a highly respected speaker or two could make it to each event. This would cover their travel expenses and maybe put a few bucks in their pocket. Other than that I would like PASS to keep their distance from these events. A non-intervention pact should be put in place to allow each event to continue to run independently. I believe that without this independence something new will spring up and take its place.

The question has been asked by Sean McCown (blog|twitter) as to whether or not mentors are dead. I don’t want to believe that they are. PASS should try to get involved and start an official mentorship program. I’m sure there are a lot of experienced DBAs, developers and BI folks out there who would gladly help bring along someone still in their early days of working with SQL. Why not bring these folks together? Offer up some kind of support program to facilitate the contact between them and continue the relationship by seeking feedback from both parties.

Don’t change anything for 24 Hours of PASS. It’s great. Just be sure not to exclude speakers; give the community a chance to vote on every submitted session. Thomas LaRock (blog|twitter) has taken on the mantle of owning 24HOP. His idea of having only women speakers for the March 24HOP is, I think, great; it really helps promote diversity in the speaker pool, which takes me to my final point.

There are some great speakers out there, and PASS needs to help groom the next generation. This is one of the areas that I think could be huge for the community. It would be great if PASS could introduce some kind of speaker program that included feedback, evaluations and training. It would really help take people from SQLSaturday and local user group speaker to PASS Summit speaker and beyond.

 

Final Thought

PASS Board, keep on keeping on. Overall you’re doing a great job. Keep getting people involved with volunteer efforts; run the Summit, the Rally and SQL Saturdays. Don’t try to move into areas already well served by other companies, websites or tools.

Fun With Windows Logins In SQL

Sometimes you come across quirkiness when playing with SQL Server. Once in a while those things are related to security. This happens to be one of those times.

 

Release My Code

I was provided with a script by the dev team to create a new proc and grant permissions on that proc to a Windows login that already had access to SQL. No big deal. I executed the script, received no errors and happily went about the rest of my day. A couple of hours later I got an email from the developer saying that when they attempted to call the proc they were getting an error stating that the server principal was not able to access the database under the current security context.

 

Fix It Damn You

After much troubleshooting to no avail I ended up scripting out all the user’s permissions, dropping and recreating the user, then applying those permissions back again. Everything worked and the developers went away happy. I myself was not happy, and so I restored an old copy of the database to a test server to try and find out what the deal was. I found something interesting. Join me in a short but fascinating journey into the world of the Windows login…

 

Setup A Test

The following is based upon test cases in SQL 2008 and 2008 R2. I have not tested on SQL 2005 and so do not know if the behavior might be different.

Create yourself a shiny new Windows user on your machine. Mine’s called SQL1 (because I’m original like that). Now add that as a login within SQL.

CREATE LOGIN [AnonyLPTP\SQL1] FROM WINDOWS
GO

Just because validation is good, it’s worth double checking that the login exists:

SELECT Name, type_desc FROM sys.server_principals
    WHERE name = 'AnonyLpTp\SQL1'
    ORDER BY name

 

Looks like we’re good on that front.

Now let’s create a dummy database for our security test and add a table and proc to that.

CREATE DATABASE SecurityTest
GO

USE SecurityTest
GO

CREATE TABLE dbo.QuickTest (ID INT, Col2 VARCHAR(30))
GO

INSERT INTO dbo.QuickTest VALUES (1, 'some text'), (2, 'different text');
GO

CREATE PROCEDURE dbo.ShowCols
AS
    SELECT ID, Col2 FROM dbo.QuickTest
GO

Feel free to execute the proc at this point just to be sure that you get results.

 

 Any Users There?

We’ve not added any users to this database, and most certainly that new login we just created isn’t going to be there. In the interests of being sure let’s check real quick.

SELECT Name FROM SecurityTest.sys.database_principals
    WHERE name = 'AnonyLpTp\SQL1'
    ORDER BY name

Nope, no user there.

 

Now For Some Magic

If we try to grant execute on the ShowCols proc to that login it’s going to fail, because that user doesn’t exist in the database, right? I know for sure that’s what happens with SQL logins; after all, I’ve seen the “Cannot find the user ‘&lt;username&gt;’, because it does not exist or you do not have permission” error on countless occasions.
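For comparison, here’s roughly what that looks like with a SQL login; the login name below is made up purely for illustration, but the failure is the one I’m describing:

-- A hypothetical SQL login that exists only at the server level, with no user in SecurityTest
CREATE LOGIN SqlOnlyLogin WITH PASSWORD = 'Str0ng!Passw0rd';
GO

USE SecurityTest
GO

-- This fails with the "Cannot find the user" error quoted above,
-- because no database user exists for the SQL login
GRANT EXECUTE ON dbo.ShowCols TO SqlOnlyLogin;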

Let’s try it anyway and see what happens

GRANT EXECUTE ON SecurityTest.dbo.ShowCols TO [AnonyLPTP\SQL1]

Yup, there’s the error…wait, what? There’s no error, it was successful. That must be a mistake surely?

Well let’s check the database principals again

SELECT Name FROM SecurityTest.sys.database_principals
    WHERE name = 'AnonyLpTp\SQL1'
    ORDER BY name

So it appears as though a user has been created for the login without us needing to tell it to do so. In actual fact it has gone one step further and also created a schema.

SELECT name, principal_id FROM SecurityTest.sys.schemas
    WHERE name = 'AnonyLpTp\SQL1'

 

 

But Wait…

Let’s try to execute the proc under the context of that user

EXECUTE AS USER = 'AnonyLpTp\SQL1';
EXEC SecurityTest.dbo.ShowCols;

 

Hey, that’s the error the devs were reporting. This makes no sense though; we granted permissions on the proc, and there’s a user there and everything. What gives?

 

Security Fundom

So there’s a user, and there are permissions on the proc, but we get an error. Most strange. Well, actually, I know we granted permissions on the proc, but did they actually make it there? We can use sys.database_permissions to double check. In this case we’ll check permissions for all the Windows users in the database.

USE SecurityTest
GO

SELECT
      dpri.name
    , dper.class_desc
    , OBJECT_NAME(dper.major_id) AS ObjectName
    , permission_name
FROM
    sys.database_permissions dper
        INNER JOIN sys.database_principals dpri
            ON dper.grantee_principal_id = dpri.principal_id
WHERE dpri.type = 'U'

 

Yup, the permissions are definitely there. Execute on ShowCols. So that’s not an issue.

Note, there’s another row in the table showing CONNECT permissions for dbo. This permission is missing for our login. Interesting. Let’s add that for our user.

GRANT CONNECT ON DATABASE::SecurityTest TO [AnonyLPTP\SQL1]

Running the permissions script again now gives 3 rows.

Now if we attempt to run the proc under the context of the login.

EXECUTE AS USER = 'AnonyLpTp\SQL1';
SELECT USER_NAME(); -- Confirm we're using the right user

EXEC SecurityTest.dbo.ShowCols;

 

It worked!

 

So What Is Going On Here?

When you assign permissions on an object in a database to a Windows login, a user and schema are automatically created for it; however, the login is not granted the right to connect to the database, so it cannot actually make use of those permissions. We can easily grant the CONNECT permission on the database that the login needs in order to use the permissions it has already been assigned.

This is very strange behavior (especially given that SQL logins do not work the same way). I’m sure there are reasons and use cases behind it, even though I’ve not been able to find any documentation describing it.

If you’ve run across this before, have use cases or know of documentation describing the behavior please let me know in the comments and I’ll update the post.
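And if you’ve been following along at home, a quick bit of housekeeping will remove the demo objects (the names below are the same ones used above):

-- Revert any EXECUTE AS context and close other connections to SecurityTest before dropping it
USE master
GO
DROP DATABASE SecurityTest
GO
DROP LOGIN [AnonyLPTP\SQL1]
GO
DROP LOGIN SqlOnlyLogin  -- only if you created the illustrative SQL login earlier
GO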

Are You Putting Out?–The SQL OUTPUT Clause

I had some code come through the other day from a developer; it was designed to perform an insert and then return the inserted data back to the client for display. This kind of code is probably seen very frequently out in the real world.

 

Quick, Create A Table

As we’ll be playing around a little, let’s throw in a small table just for demonstration purposes:

CREATE TABLE OutputTesting
    (
      ID INT IDENTITY(1,1) PRIMARY KEY CLUSTERED
    , RecordDate DATETIME NOT NULL
    , EnteredValue VARCHAR(100) NOT NULL
    )

 

 

Old School Insert

Here was the developer code that came through:

BEGIN TRAN
        INSERT INTO OutputTesting (RecordDate, EnteredValue)
        VALUES (GETDATE(), 'Some other text')
SELECT ID, RecordDate, EnteredValue FROM OutputTesting WHERE ID = SCOPE_IDENTITY()
COMMIT TRAN

The first thing of note here is the complete lack of error handling in the statement. Sure, we have a BEGIN and COMMIT TRAN, but if there’s a problem with the insert we just don’t deal with it, and a transaction could be left open (not to mention nothing being returned to the client).

The second thing to notice is how this returns data back to the client: we actually perform a query against the table using SCOPE_IDENTITY() in the WHERE clause. SCOPE_IDENTITY() contains the identity value inserted within the current session and current scope (meaning that, unlike @@IDENTITY, it won’t return the value from a table that had an insert fired by a trigger).
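As a quick aside, here’s a throwaway sketch of that scope difference; the audit table and trigger below are hypothetical, purely to contrast SCOPE_IDENTITY() with @@IDENTITY:

-- Hypothetical audit table, populated by a trigger on OutputTesting
CREATE TABLE dbo.OutputTestingAudit (AuditID INT IDENTITY(1000,1), Note VARCHAR(50));
GO

CREATE TRIGGER dbo.trg_OutputTesting_Audit ON dbo.OutputTesting AFTER INSERT
AS
    INSERT INTO dbo.OutputTestingAudit (Note) VALUES ('row added');
GO

INSERT INTO dbo.OutputTesting (RecordDate, EnteredValue) VALUES (GETDATE(), 'trigger test');

SELECT SCOPE_IDENTITY() AS ScopeIdentity,   -- identity value from OutputTesting, our scope
       @@IDENTITY       AS SessionIdentity; -- identity value from the audit table, inserted by the trigger

-- Tidy up: a table with an enabled trigger can't be the target of an OUTPUT clause
-- without INTO, and we'll be doing exactly that further down
DROP TRIGGER dbo.trg_OutputTesting_Audit;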

This kind of query was heavily used back in the SQL Server 2000 days, but now there is a much better way to do this. Enter…

 

The Output Clause

Whenever an INSERT, UPDATE, DELETE or MERGE is performed you can use OUTPUT to return the impacted data.

Usage is very simple: just plop the OUTPUT right in the middle there and you are done. So, taking the initial example (and for now ignoring the error handling aspects), we can simply rewrite it as follows:

BEGIN TRAN
        INSERT INTO OutputTesting (RecordDate, EnteredValue)
        OUTPUT INSERTED.ID, INSERTED.RecordDate, INSERTED.EnteredValue
        VALUES (GETDATE(), 'Some other text')
COMMIT TRAN

We got back results exactly the same way as we would have in the old school fashion and did so without having to perform the extra read against the OutputTesting table.

 

Where Output Falls Down

Nothing’s perfect, right? The thing with OUTPUT is that it will attempt to return a result set even if your statement fails. Let’s rewrite the query slightly and use the TRY…CATCH syntax to handle errors:

BEGIN TRY
    BEGIN TRAN
        INSERT INTO OutputTesting (RecordDate, EnteredValue)
        OUTPUT INSERTED.ID, INSERTED.RecordDate, INSERTED.EnteredValue
        VALUES (GETDATE(), 'Some text')
    IF @@TRANCOUNT > 0
        COMMIT TRAN
END TRY

BEGIN CATCH
    SELECT ERROR_MESSAGE() AS InsertError
    ROLLBACK TRAN
END CATCH

Ok, great, this worked as expected. Let’s throw a spanner in the works and pass invalid data to the insert and see what happens:

BEGIN TRY
    BEGIN TRAN
        INSERT INTO OutputTesting (RecordDate, EnteredValue)
        OUTPUT INSERTED.ID, INSERTED.RecordDate, INSERTED.EnteredValue
        VALUES ('2011/01/301', 'Some text')
    IF @@TRANCOUNT > 0
        COMMIT TRAN
END TRY

BEGIN CATCH
    SELECT ERROR_MESSAGE() AS InsertError
    ROLLBACK TRAN
END CATCH

Hey, what gives? The insert failed (we have our insert error there), yet it still attempted to return a result set for the OUTPUT. We could end up with all kinds of problems if we tried to return that to the client. People would freak out; it would be anarchy!

 

A Better Way To Handle Output

If we’re going to go ahead and use OUTPUT, it’s best to push that data into a table variable and then use that to return results to the client if the insert is successful; otherwise we can just pass back the error:

DECLARE @OutputResults TABLE (
      ID INT
    , RecordDate DATETIME
    , EnteredValue VARCHAR(100))

BEGIN TRY
    BEGIN TRAN
        INSERT INTO OutputTesting (RecordDate, EnteredValue)
        OUTPUT INSERTED.ID, INSERTED.RecordDate, INSERTED.EnteredValue INTO @OutputResults
        VALUES (GETDATE(), 'Some text')
    IF @@TRANCOUNT > 0
        COMMIT TRAN
    SELECT * FROM @OutputResults
END TRY

BEGIN CATCH
    SELECT ERROR_MESSAGE() AS InsertError
    ROLLBACK TRAN
END CATCH

Now if we attempt to insert invalid data this time:

DECLARE @OutputResults TABLE (
      ID INT
    , RecordDate DATETIME
    , EnteredValue VARCHAR(100))

BEGIN TRY
    BEGIN TRAN
        INSERT INTO OutputTesting (RecordDate, EnteredValue)
        OUTPUT INSERTED.ID, INSERTED.RecordDate, INSERTED.EnteredValue INTO @OutputResults
        VALUES ('2011/01/301', 'Some text')
    IF @@TRANCOUNT > 0
        COMMIT TRAN
    SELECT * FROM @OutputResults
END TRY

BEGIN CATCH
    SELECT ERROR_MESSAGE() AS InsertError
    ROLLBACK TRAN
END CATCH

All we get back is the actual insert error; so much cleaner.

We should really take the error handling a step further and use RAISERROR, passing in all the error details, but this is at least a solid foundation and far better than what we had previously.
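If you do want to take that extra step, the CATCH block might look something like the sketch below; it simply re-raises the original error details with RAISERROR instead of SELECTing them back as a result set:

BEGIN TRY
    INSERT INTO OutputTesting (RecordDate, EnteredValue)
    VALUES ('2011/01/301', 'Some text')  -- same invalid date as above
END TRY

BEGIN CATCH
    DECLARE @ErrMsg      NVARCHAR(2048) = ERROR_MESSAGE(),
            @ErrSeverity INT            = ERROR_SEVERITY(),
            @ErrState    INT            = ERROR_STATE();

    IF @@TRANCOUNT > 0
        ROLLBACK TRAN;

    -- Re-raise the original error so the client gets a genuine error rather than a result set
    RAISERROR(@ErrMsg, @ErrSeverity, @ErrState);
END CATCH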

 

Output Isn’t Just For Inserts

Heck no, OUTPUT is good for all your DML operations.

DELETE will tell you the records you just removed:

DELETE FROM OutputTesting
OUTPUT DELETED.ID, DELETED.RecordDate, DELETED.EnteredValue
WHERE ID = 3

 

UPDATE can give you information on the old and new values for the record:

UPDATE OutputTesting
SET RecordDate = DATEADD(d, 1, GETDATE())
OUTPUT INSERTED.ID, INSERTED.RecordDate AS NewDate, DELETED.RecordDate AS OldDate
WHERE ID = 2

 

MERGE has a little bonus OUTPUT feature called $action, which tells you what kind of DML action was performed as well as giving you the data:

DECLARE @InputTable TABLE (ID INT, RecordDate DATETIME, EnteredValue VARCHAR(100))

INSERT INTO @InputTable VALUES
    (1, GETDATE(), 'Some new text'),
    (2, GETDATE(), 'Some other text'),
    (500, GETDATE(), 'New text')

MERGE OutputTesting AS Target
USING (SELECT ID, RecordDate, EnteredValue FROM @InputTable) AS Source
ON Target.ID = Source.ID
WHEN MATCHED THEN
    UPDATE SET Target.RecordDate = Source.RecordDate, Target.EnteredValue = Source.EnteredValue
WHEN NOT MATCHED BY TARGET THEN
    INSERT (RecordDate, EnteredValue)
    VALUES (Source.RecordDate, Source.EnteredValue)
OUTPUT $action, INSERTED.*, DELETED.*;

 

Output Is Your Friend

Remember folks, old school has its place, like a Flock Of Seagulls hairstyle at an ’80s revival concert, but there are new and better things now.

Output loves you: it gives you lots of information, and it won’t kill your server to give it to you. Output is putting out, are you?

Keeping MSDB Clean

I have a bunch of monitoring routines running against my SQL Servers. One of these checks available disk space on drives. Today I was surprised to see that one of my SQL system drives was looking pretty full. This is unusual to say the least because I dedicate a drive to the system (master/model/msdb) databases. Upon connecting to the server I discovered that MSDB had blown up to over 12GB in size.

 

Checking For Large Tables

I used the following query to look for large tables in MSDB (this query is based entirely on running Profiler against SQL Server while running a Disk Usage By Table report):

SELECT TOP 10
    a3.name + '.' + a2.name AS [tablename],
    a1.rows AS row_count,
    (a1.reserved + ISNULL(a4.reserved, 0)) * 8 AS reserved
FROM
    (SELECT
        ps.object_id,
        SUM (CASE WHEN (ps.index_id < 2) THEN row_count ELSE 0 END) AS [rows],
        SUM (ps.reserved_page_count) AS reserved
        FROM sys.dm_db_partition_stats ps
        GROUP BY ps.object_id) AS a1
LEFT OUTER JOIN
    (SELECT
        it.parent_id,
        SUM(ps.reserved_page_count) AS reserved,
        SUM(ps.used_page_count) AS used
        FROM sys.dm_db_partition_stats ps
        INNER JOIN sys.internal_tables it ON (it.object_id = ps.object_id)
        WHERE it.internal_type IN (202, 204)
        GROUP BY it.parent_id) AS a4
        ON (a4.parent_id = a1.object_id)
INNER JOIN
    sys.all_objects a2
        ON (a1.object_id = a2.object_id)
INNER JOIN
    sys.schemas a3
        ON (a2.schema_id = a3.schema_id)
WHERE
    a2.type <> N'S' AND a2.type <> N'IT'
ORDER BY
    reserved DESC, [tablename]

The results were a little shocking to say the least.

  • sysmail_mailitems contained 717,878 rows and was over 9GB in size
  • backupfile/backupmediafamily/backupmediaset/backupset combined had over 8.6 million rows and totalled over 2.5GB in their space usage

 

Forgetting To Clean Up

MSDB is something frequently forgotten. Cleaning up job history is nice and easy; it’s just a setting in the SQL Agent. Cleaning up your sent emails and backup records is not as straightforward. Why Microsoft hasn’t included equivalent automatic cleanup I don’t know, especially given that they have provided stored procedures to perform the cleanup of this data.
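As an aside, if you would rather script the job history cleanup too, instead of relying on the Agent setting, something along these lines should do it (the 90-day retention is just an example figure):

/* Trim SQL Agent job history older than 90 days - an alternative to the Agent GUI setting */
DECLARE @oldest DATETIME = DATEADD(d, -90, GETDATE())
EXEC msdb.dbo.sp_purge_jobhistory @oldest_date = @oldest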

My server has been in service since March 2009. That’s a lot of backups and a lot of emails sent. I needed to get a handle on cleaning this up and then implementing a process to ensure that these tables didn’t get out of control again.

 

Deleting Old Mail Items

Microsoft have provided the procedure sysmail_delete_mailitems_sp to perform cleanup of the sysmail_mailitems table. Two parameters can be passed in:

  • @sent_before – datetime – used to delete mail items sent prior to a certain date
  • @sent_status – varchar(8) – used to force deletion of only a certain sent status

In this instance I was not concerned with what items were going to be deleted, I just wanted to get rid of everything older than 14 days.

Here’s a bad way to do this:

DECLARE @DATE DATETIME = DATEADD(d, -14, GETDATE())
EXEC msdb.dbo.sysmail_delete_mailitems_sp @sent_before = @DATE

Actually, let me clarify: this isn’t a bad way to do it if you don’t have two years and 9GB of data to delete. In fact this is the script I implemented in a job to keep the table size down going forward. Running it against such a large amount of data, though, just blew up the transaction log, filled the drive and then forced a rollback.

Going a slightly smarter route, I performed the deletion in small batches (5 days’ worth of email at a time) using the following:

DECLARE @maildate DATE = '2009-06-15'

WHILE @maildate < DATEADD(d, -14, GETDATE())
    BEGIN
        EXEC msdb.dbo.sysmail_delete_mailitems_sp @sent_before = @maildate
        SELECT @maildate = DATEADD(d, 5, @maildate)
    END

This took a little while however the transaction log stayed small and it completed without incident.

 

Deleting Backup History

Again Microsoft do us a kindness to help clean up old data. This time it’s the procedure sp_delete_backuphistory, to which we pass:

  • @oldest_date – datetime – the oldest backup history record we want to keep

 

Having learned my lesson from deleting the old mail items I again attempted to perform the delete in batches using:

DECLARE @backupdate DATE = '2009-06-15'

WHILE @backupdate < DATEADD(d, -90, GETDATE())
    BEGIN
        EXEC sp_delete_backuphistory @backupdate
        SELECT @backupdate = DATEADD(d, 15, @backupdate)
    END

I figured this would take a while. Half an hour later it was still churning away, and wait stats showed a crazy amount of I/O going on. This didn’t seem right, so I stopped the script and took a look at the execution plan.

It was a pretty large plan, which made it difficult to look through. Luckily, a few weeks ago I had downloaded the completely awesome SQL Sentry Plan Explorer from SQLSentry.net. If you have not installed this FREE tool yet then I recommend you go out there right now (well, after finishing reading this post) and get it. It makes execution plans not just readable but sortable, viewable and workable. Just using Plan Explorer once can save you hours. I’ll add that I am not associated with SQL Sentry in any manner, nor am I getting any kickback; the tool is just that good.

Pasting the estimated execution plan XML into Plan Explorer showed multiple clustered index scans against the backupset table. It seems that a couple of critical indexes are missing from this table.

I was quickly able to ascertain that a WHERE clause was being used on backup_finish_date, which is indexed; however, the query needed to return media_set_id, which is not included in that index. Later in the plan I could see a join against media_set_id, which isn’t indexed at all.

These missing indexes were causing the massive I/O numbers I was seeing. To resolve the issue I quickly added a couple of indexes.

USE msdb
GO

IF NOT EXISTS (SELECT * FROM sys.indexes WHERE OBJECT_ID = OBJECT_ID('[dbo].[backupset]') AND name = 'IDX_BackupSet_FinDate_MediaSet')
    CREATE NONCLUSTERED INDEX IDX_BackupSet_FinDate_MediaSet ON backupset (backup_finish_date) INCLUDE (media_set_id)

IF NOT EXISTS (SELECT * FROM sys.indexes WHERE OBJECT_ID = OBJECT_ID('[dbo].[backupset]') AND name = 'IDX_BackupSet_MediaSet')
    CREATE NONCLUSTERED INDEX IDX_BackupSet_MediaSet ON backupset (media_set_id)

This done I took another look at the estimated plan and things looked a great deal cleaner.

I kicked off the script again and this time it completed in under 5 minutes. Several million rows were deleted and 2GB of space reclaimed.

 

Keeping Things Clean

Not wanting to ever run into this situation again I added a job with a couple of steps to delete mail items older than 14 days and backup history older than 90 days.

This is done with a quick execution of the cleanup procedures. Thanks to the data being brought current and the job running every night I won’t have to worry about extended run times or MSDB getting out of control.

/* Cleanup old backup history */
DECLARE @backupdate DATE = DATEADD(d, -90, GETDATE())
EXEC msdb.dbo.sp_delete_backuphistory @backupdate

/* Cleanup old mail items */
DECLARE @maildate DATE = DATEADD(d, -14, GETDATE())
EXEC msdb.dbo.sysmail_delete_mailitems_sp @sent_before = @maildate

 

Final Thought

I hadn’t taken sufficient care in ensuring that old data was kept clean. In fact I hadn’t even realized that backup history and old mail items could be a concern. Since I found this today I’ve started going through and performing this cleanup action on each of my SQL instances. Once that’s done I’ll be using PowerShell to deploy the nightly cleanup job on every instance.

I highly recommend checking your own servers to see if you might be encountering the same problems.