
SQL Azure in. Is my job out? October 30, 2009

Posted by msrviking in General.

I have been working with databases from day one of my time in the IT field. I have worked on everything from MS Access 97 to SQL Server 2008, written millions of lines of code, administered and managed databases in all kinds of environments, and designed systems for availability, scalability and so on. So people in my organization know me as the database guy (not to boast!). On a lighter note – my lunch box is shaped like a cylinder, pictorially a database 😉 so now you know what I am talking about.

All of this holds as long as people look to me for anything database-related, but some kind people who don't really know what I am thinking like to test me: what will become of your job if SQL Azure goes full cylinders? We may lose business, and that means my job is at stake in the future. This question came from one of my team mates (an application technical architect), and my answer came back spontaneously. I replied – what does it matter? We DB guys shall upgrade our knowledge of the technology, but the base will remain the same. And here is what I meant by the "base" – as long as a DBA knows how to store data, query data and protect data, whatever the technology may be, catching up with it is an easy job.

Similar thoughts are reflected in Buck's post here. It's quite a coincidence, and my thanks to Buck for helping me think much more clearly after reading it through. Do visit the post and feel comfortable about your future.



DB Designing? Objective /Goal October 30, 2009

Posted by msrviking in DBA Rant.

One of my shop mates came up to me and said – "Hi! I did a database design for one of the projects. I need to set a goal so that my project supervisor can assess me at the end of the exercise."

Well, my first thought was: what is a non-DB guy doing in a database design? I am definitely not comparing myself with him or with anyone else, but it surprises me sometimes when technical folks (especially application dev leads) go ahead and design tables. I have no aversion towards the apps team or "great" tech leads, but in my shop it's like "I create the mess! and then someone reviews and fixes." I don't appreciate the idea of the app team designing the database unless the person is a lot more experienced in DB design along with application development. At the end of the day, it's we DB guys who clean up the mess. Oh, I am cribbing a lot again, and I keep expressing all this in my posts, but my intent is to convey what is good and what is bad in the DB process.

After dodging this guy for a little while, I decided to put some thought into what he had asked (a goal for the end of the activity). Here is what I shared with him, and I would like you all to share your thoughts too.

All these come from years of experience, so feel free to comment.

– Understand requirements through case studies, prototypes, documentation

– Map the requirements per module to a specific subject area (application modules, reporting, archiving, auditing, and any other few non-functional requirements)

– Identify key entities with high level attributes and prepare a conceptual data model

– Transform a conceptual data model to an E-R diagram with entities, attributes (key & non-key), relationships (1:N, N:1, M:N)

– Transform the logical data model into third normal form (3NF)

– Walk the business analysts, technical team (application development & DB team), and business team through the data model

– Identify the gaps during the walkthrough sessions and make changes to logical data model accordingly

– Transform an evolving logical data model to physical data model

– Review the physical data model with the Database Administrator to identify the types of objects that need to be created in the DB instance (tables, triggers, data types, constraints [unique, foreign key, nullable], indexes)

– Provide scripts to the Database Administrator for creating tables and other objects appropriately in the database

– Provide guidelines on the volume of data that could be generated for each table, and plan for storage along with the DBA

– Ensure the scripts provided for object creations are appropriate per understanding with the app team & DB team

– Evolve the data model as and when requirements change (new /added)

– Ensure that the data model conforms to normalization process per normalization rules

– Ensure that the data model is flexible enough for future changes and not so rigid that it causes overhead later

– Ensure the process of synchronizing the logical data model, the physical data model and the physical objects is smooth

– Provide guidelines to the team on the use of the entities, attributes, relationships from a logical data model perspective
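To make the physical-model handoff above concrete, here is a tiny, hypothetical script of the kind a designer might hand to the DBA – all table, column and constraint names are made up for illustration:

```sql
-- Hypothetical physical-model script: a parent/child pair with a primary key,
-- a unique constraint, a nullable column, a foreign key, and a supporting index.
CREATE TABLE dbo.Customer
(
    CustomerID   INT           NOT NULL CONSTRAINT PK_Customer PRIMARY KEY,
    CustomerName NVARCHAR(100) NOT NULL,
    Email        NVARCHAR(255) NULL,                 -- nullable attribute
    CONSTRAINT UQ_Customer_Email UNIQUE (Email)
);

CREATE TABLE dbo.CustomerOrder
(
    OrderID    INT      NOT NULL CONSTRAINT PK_CustomerOrder PRIMARY KEY,
    CustomerID INT      NOT NULL
        CONSTRAINT FK_CustomerOrder_Customer REFERENCES dbo.Customer (CustomerID),
    OrderDate  DATETIME NOT NULL
);

-- Index on the foreign key for the expected join path between the two entities
CREATE INDEX IX_CustomerOrder_CustomerID ON dbo.CustomerOrder (CustomerID);
```

Notice how each object type from the review list (tables, data types, constraints, indexes) shows up in the script the DBA receives.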


SQL Server DBA? October 29, 2009

Posted by msrviking in DBA Rant.

I had an assignment around two weeks ago which was one of the toughest I could list. One of the clients had a performance issue on one of their boxes and wasn't able to identify the cause.

Someone from the DBA group (where I don't belong) had been identified to work on this – identify the problem & provide recommendations. I was asked to guide this gentleman while he was on the job! Well, it just happens sometimes that I have to spend my energy thinking about my own activities on my table and about others' as well. This is part of my job and I definitely won't gripe about it. I told myself, "I need to help this guy just the way I would help myself."

All was set to take off, but the one thing missing from the beginning to the end of the exercise was that the DBA was not communicating. I did follow-ups, mails, calls (short guidance), visited the DBA's workplace to check if we were headed in the right direction, and so on. All effort was on, and finally, after the results were obtained from the monitoring scripts, the analysis phase started. The same communication problem persisted during this stage too, and I had no idea what was happening at the other end.

One fine evening I got a recommendation list to evaluate and validate, after a single-line follow-up. The next day I spent three hours trying to collate information from large logs and validate it. In vain! I couldn't complete it, and the presentation was scheduled for the next day :o. I gave up at the end, offering my high-level thoughts, and as expected it all backfired – the client wasn't happy with what was done.

And then one day I was asked to understand everything, analyze, recommend and present my findings myself. It was hell-of-days (5 days straight) running through large logs (> 1.5 GB), traces and performance counters, trying to tie several pieces of information together. A crazy time, but I didn't crib at all. In the end I could present findings and recommendations and save the face of the activity called "DB Performance Analysis". But one learning – be whatever you are (multi-certified, well recognized), things won't work if you don't communicate. Communication is so important for any activity in life (work /personal). So guys, if you find yourself in such a situation, I would say help the other guy communicate, else you will be stuck as I was.

I hope this experience of mine helps anyone else out there.


Reports subscription in SSRS 2008 October 29, 2009

Posted by msrviking in Configuration.

I had a request from my clients asking, "Can a user subscribe to reports on his own?". My quick answer was "Yes", and I did a POC to find out how it could be done and what the limitations are.

After the POC I decided to document it, and I shared the document with the client team. I thought I should share with you all what it is about! The content isn't exhaustive, but it could probably help as an initial start.


Scheduling a report in Report Server (SSRS 2008)


A report subscription is a standing request to have a report processed and delivered automatically on a schedule. A subscription is processed on the report server, and the delivered report can be placed in a shared folder on a file server or sent to e-mail addresses.

A subscribed report uses stored credentials, and the user wanting to create a subscription should have permission to view the report as well as to create individual subscriptions. As part of configuration at the report server level, scheduled events and report delivery should be enabled (e-mail delivery is configured separately). Additional delivery extensions can also be added by installing custom-developed extensions.

Types of subscriptions:

– Standard subscriptions are created and managed by individual users. A standard subscription consists of static values that cannot be varied during subscription processing. For each standard subscription, there is exactly one set of report presentation options, delivery options, and report parameters.

– Data-driven subscriptions get subscription information at run time by querying an external data source that provides the values used to specify a recipient, report parameters, or rendering format. These subscriptions are typically created and managed by report server administrators.

The limitations of using data-driven subscriptions are as follows:

  • Data-driven subscription functionality is not available in Standard Edition.
  • For subscription data, choose a data source that can provide schema information to the report server. The supported data source types include SQL Server, Oracle, Analysis Services databases, SQL Server Integration Services package data, ODBC data sources, and OLE DB data sources.
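As an illustration of what a data-driven subscription queries at run time, here is a hypothetical subscription data query – the table and column names are invented, but the shape is typical: one row per delivery, supplying recipient, a report parameter value, and the render format:

```sql
-- Hypothetical subscription data query (dbo.ReportRecipients is made up):
-- each row drives one delivery at run time.
SELECT
    EmailAddress AS [TO],          -- recipient of the delivered report
    RegionCode   AS RegionParam,   -- value mapped to the report's Region parameter
    'PDF'        AS RenderFormat   -- output format chosen per recipient
FROM dbo.ReportRecipients
WHERE IsActive = 1;
```

The report server maps each column of the result set to a delivery setting or report parameter when the subscription fires.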

Permissions for subscriptions:

Users can subscribe to reports through two tasks, described below.

– The "Manage individual subscriptions" task allows creating, modifying, and deleting subscriptions owned by a user for a specific report. This task is part of the predefined Browser and Report Builder roles. Any user assigned to one of these roles can manage only the subscriptions he or she owns.

– The other task, "Manage all subscriptions", allows users to access and modify all subscriptions. This task is required for data-driven subscriptions and is part of the predefined Content Manager role.

Creating standard subscription:

A standard subscription can be created by individual users who want to have a report delivered through e-mail or to a shared folder. A standard subscription is always defined through the report on which it is based.  A user who creates a subscription owns that subscription. Each user can modify or delete the subscriptions that he or she owns.

Pre-requisites /Limitations of Standard subscription

Requirement 1

The user must have permission to view the report chosen for subscription, and must be assigned to a role that includes the "Manage individual subscriptions" task.

Requirement 2

The report must use stored credentials or no credentials to retrieve data at run time. A report configured to use the impersonated or delegated credentials of the current user to connect to an external data source cannot be subscribed to. The stored credentials can be a Windows account or a database user account.

Requirement 3

If the report uses a model as a data source and the model contains security settings, the report cannot be subscribed to.

Requirement 4

If the report requires a parameter at processing time, the input parameter value must be specified while scheduling the subscription.

Subscription creation

Once the subscription is created, a SQL Agent job with a system-generated ID is created, with particulars like:

– Job name

– Job description

– Job schedule (day, hour, frequency)

– Job running account – Local Service

– A Transact-SQL step that fires the event which triggers subscription processing

When the job runs successfully, an e-mail is sent and /or the scheduled report is placed in the shared folder on the file server for later access.
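If you want to see the link between subscriptions and their backing schedules for yourself, a query like the one below worked in my POC against the default ReportServer catalog. Fair warning: these catalog tables are undocumented and their schema can change between versions, so treat this as a sketch, not a supported interface:

```sql
-- List subscriptions, their reports, and their backing schedules from the
-- ReportServer catalog (undocumented schema; read-only poking around).
USE ReportServer;

SELECT
    c.[Path]      AS ReportPath,
    s.Description AS SubscriptionDescription,
    s.LastStatus,                  -- last delivery outcome
    s.LastRunTime,
    sch.ScheduleID                 -- this GUID is also the SQL Agent job name
FROM dbo.Subscriptions  AS s
JOIN dbo.[Catalog]      AS c  ON c.ItemID = s.Report_OID
JOIN dbo.ReportSchedule AS rs ON rs.SubscriptionID = s.SubscriptionID
JOIN dbo.Schedule       AS sch ON sch.ScheduleID = rs.ScheduleID;
```

The ScheduleID column is handy because it matches the name of the SQL Agent job described above, so you can tie a subscription straight to its job.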

Other details:

Subscriptions create their own report-specific schedules defined through subscription properties, but shared schedules are easier to manage and maintain for the following reasons:

  • Shared schedules can be managed from a central location, making it easier to compare schedule properties and adjust frequencies and recurrence patterns.
  • If shared schedules are used, it is known precisely when scheduled operations occur. This makes it easier to anticipate and accommodate server loads before performance issues arise.


Let me know what you guys think!



Trace skipped records – SQL Server profiler October 28, 2009

Posted by msrviking in Performance tuning.

Recently I was working on a performance analysis of a "mystery monster" stored procedure in one of the projects at my shop. I went about it the usual way I pick out pain points of stored procedures /SQL statements, and decided to run SQL Server Profiler to profile the stored procedure's performance. All was done, Profiler ran fine, and I could see a lot of information captured, though not overwhelming. I took a deep breath to work through 2000 lines of profiled data, and as I was scanning through I saw something I had never seen before.

Here is the snapshot which I had in the profiler,


[Screenshot: Profiler results showing "Trace Skipped Records" rows]


I thought this was something new I hadn't seen before and took it as learning (partly ignorance too 😦 ). But by coincidence, someone, somewhere had run into the same problem – and this is what I appreciate about sharing information across the globe. That someone is ScaryDBA, who had the same issue but went and dug around to see what it was all about. Here is what he had to say in his blog post – Snags with Profiler GUI.

Learnings – don't be ignorant like me; dig around when you find something new, else you are left with half-knowledge.
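One way around the Profiler GUI entirely, and a common recommendation for performance work, is a server-side trace that writes events straight to a file. The sketch below captures SP:Completed events for a workload like mine; the output path is an assumption, so point it at a folder that exists on your server:

```sql
-- Server-side trace sketch: no Profiler GUI involved, events go to a file.
-- The path C:\Traces\proc_perf is an assumption (.trc is appended automatically).
DECLARE @TraceID INT;
DECLARE @MaxFileSize BIGINT;
SET @MaxFileSize = 50;  -- MB per trace file

EXEC sp_trace_create
    @TraceID OUTPUT,
    0,                       -- no rollover or shutdown options
    N'C:\Traces\proc_perf',
    @MaxFileSize;

-- Capture SP:Completed (event 43): TextData (col 1), SPID (col 12), Duration (col 13)
EXEC sp_trace_setevent @TraceID, 43, 1,  1;
EXEC sp_trace_setevent @TraceID, 43, 12, 1;
EXEC sp_trace_setevent @TraceID, 43, 13, 1;

EXEC sp_trace_setstatus @TraceID, 1;   -- start the trace

-- ...run the workload, then stop (status 0) and close (status 2) the trace:
-- EXEC sp_trace_setstatus @TraceID, 0;
-- EXEC sp_trace_setstatus @TraceID, 2;
```

Once the trace is closed, the file can be loaded back with `fn_trace_gettable` for analysis, with no risk of the GUI dropping rows on a busy server.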

Thanks to ScaryDBA for his thoughts!



Free SQL Server tutorial October 28, 2009

Posted by msrviking in Training.

I am sure whoever reads my blog knows SQL Server well. However, there will be newbies around who want to kick-start on SQL Server, and for them and for others too (including me), it is worthwhile to read and learn the content from this website http://midnightdba.itbookworm.com/.

Thanks to Sean!


Ctrl+Shift+M – SQL template October 28, 2009

Posted by msrviking in Tips & Tricks.

I was doing my regular reading through my favorite blogs and learnt something new early in the morning. Here is what it is all about.

Being on the development side of applications, I always recommend that developers, leads and everyone who wants to write T-SQL code use standard templates. How can we achieve this quickly?

On opening SSMS, instead of a blank window you can get your template by pressing Ctrl+Shift+M. You can customize the template by replacing the content of the file, or by replacing the file itself (keeping the same name), in the following path:

C:\Program Files\Microsoft SQL Server\100\Tools\Binn\VSShell\Common7\IDE\SqlWorkbenchProjectItems\Sql

The file you will want to edit to standardize the template is Sqlfile.SQL.
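For instance, here is a hypothetical Sqlfile.SQL you could drop into that folder – a standard header whose placeholders use SSMS's template-parameter syntax, `<Name, Type, Value>`, so Ctrl+Shift+M prompts for the values (the header fields themselves are just an example of a shop standard):

```sql
-- Hypothetical contents for Sqlfile.SQL: a standard script header whose
-- placeholders are filled in via Ctrl+Shift+M (template parameters).
/*
    Author       : <Author, sysname, >
    Created on   : <CreatedDate, varchar(10), >
    Description  : <Description, varchar(200), >
    Change notes : <ChangeNotes, varchar(200), >
*/
SET NOCOUNT ON;
GO
```

With this in place, every developer opening a new query starts from the same header instead of a blank window.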

I got this tip from Buck Woody. Thanks Buck for sharing info on such small but important things.