
RIP MCM……my mourning September 4, 2013

Posted by msrviking in Career, DBA Rant.

I didn’t want to write about this sudden turn of events, but I couldn’t hold my anguish after reading so many posts and articles, and I thought I should share my quota of wailing. Two things tipped my decision – an InsiderMail from Paul Randal (@PaulRandal | Blog) on 9/3, and the post from Satya (@SQLMaster | Blog), also on 9/3. Both industry experts shared their feelings on the news, coincidentally on the same day.

The MCM is discontinued, and this is what gave me a shocker, a stunner. This post talks about how Microsoft has decided to take it off its list. I spent a few hours reading articles, posts, and news feeds on the keyword MCM, and realized I am hearing, and actually bearing, the brunt of the fact that Microsoft has decided to retire the MCM program. If I have to be honest with myself, I was trying hard to get closer to my dream and goal of obtaining an MCM. Now it looks like I wasn’t getting anywhere near it, because something was holding me back – the retirement of the MCM on 1-Oct-2013. Not good news at all for the folks out there who have dreamt like me and put in the hours to earn, or try to earn, that ultimate achievement. Microsoft has denied that feeling to all those fans of SQL Server who worked years on the product, by simply pulling the plug on the program. The death of the MCM will be quicker than the retirement of the famous Google Reader. It took Google a few months to wind down that service; Microsoft is going for the death-punch in less than two months. A known play as usual, and one without any breather from Microsoft. Nice job, huh.

I could go on ranting about my displeasure and disappointment at the news, but then what’s the use of crying over what has happened. Work will get back to usual everywhere after a few days, and everyone will return to their priorities. However, someone has to keep up the fire and press the folks who did this abrupt thing. So go ahead and please, please vote up the Connect item on the Microsoft website created by Jen Stirrup – a SQL Server MVP – protesting and asking for the MCM program back.

One last thing – another blogger has shared his displeasure in this post, and it made sense to me: http://michaelvh.wordpress.com/2013/08/31/microsoft-is-retiring-the-mcsmmca-program/

If you happen to read this post, please share it with as many people as possible so that there is a movement to reverse the decision, or at least to come up with something better than the MCM. And even if you don’t share this post, don’t forget to vote up the Connect item.

Thanks for reading the bereavement news.

Sad Feeling – Shyam Viking.


SSIS for Extraction & Loading (EL) only September 2, 2013

Posted by msrviking in Business Intelligence, Data Integration, DBA Rant, Design, Heterogeneous, Integration, Integration Services.

There was a series of posts earlier on the different ways to pull data from Oracle to SQL Server, where I shared how to port data through Replication, CDC for Oracle, and a few others like SSIS. I thought I would give a starter to a series of posts on how we went about picking SSIS as the way to bring data from Oracle to SQL Server.

The major drivers (business and non-functional) for picking the solution were:

  • The destination database on SQL Server should be a replica of the source
  • No business rules or transformations need to be implemented when data is pulled
  • The performance of the pull and load should be optimal, with no overheads or bottlenecks on either the source or the destination

The destination system was referred to as an ODS, but it surely isn’t the same as the Operational Data Store of DWH systems. Sadly, that naming convention had been adopted by the shop where we had to implement the solution, so you will see me using the word ODS for the sake of explaining the implementation, and I am sorry for the incorrect usage of the word. You will probably see a few more conventions and standards skewed in the implementation, and of course a bit of my commentary on how this could have been handled better.

So the ODS was to hold data that would be used by downstream business analytics applications for the business users, with the intent of providing Self-Service BI. The ODS was to be an exact copy of the source – both schema and data. The job was to pull data from the source without any performance overhead, and without failures. Not to forget: there needn’t be any transformation of the data. In the end we used SSIS as only an Extraction & Loading tool instead of ETL – Extraction, Transformation and Loading.

This sounds simple, eh? But beware – it wasn’t, because the design had to be kept simple while still addressing all of the above factors. The SSIS package(s) had to handle these at a non-functional level:

  • The package should be configurable to pick up and pull data from any set date
  • The package should be re-runnable from the point of failure
  • The package should be configurable to address performance needs – package execution time should be low, the source should not be overloaded, and the destination should load at full efficiency
  • The package should be configurable to pull data from the source in preset chunks. The preset chunks could be based on a period – days/months/years – or a number of records per period
  • The package should have an option to flag any dependent packages to run or not run during initial and incremental loads
  • The package should have a defensive design to handle a range of error types
  • The package should have error logging and handling at the package level and at the record level

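As a rough illustration of the first two bullets, a watermark table can drive both the configurable start date and restartability. This is only a sketch of the idea; the table, column, and database names below are hypothetical, not the actual implementation:

```sql
-- Hypothetical control table the package reads its state from.
CREATE TABLE dbo.ExtractControl
(
    SourceTable   sysname       NOT NULL PRIMARY KEY,
    LastLoadedAt  datetime2(0)  NOT NULL,  -- watermark: restart point after a failure
    ChunkDays     int           NOT NULL   -- preset chunk size, in days
);

-- Per-chunk source query: pull one preset window past the watermark,
-- so a re-run resumes from the last committed chunk.
DECLARE @from datetime2(0), @to datetime2(0);

SELECT @from = LastLoadedAt,
       @to   = DATEADD(DAY, ChunkDays, LastLoadedAt)
FROM dbo.ExtractControl
WHERE SourceTable = N'Orders';

SELECT *
FROM SourceDb.dbo.Orders
WHERE ModifiedAt >= @from
  AND ModifiedAt <  @to;

-- On success, advance the watermark in the same transaction as the load.
UPDATE dbo.ExtractControl
SET LastLoadedAt = @to
WHERE SourceTable = N'Orders';
```

Because the watermark moves only after a chunk commits, a failed run can simply be restarted and it picks up where it left off.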
In toto, this solution was more about how we design and implement for the non-functional requirements, leaving the functional side of the data alone. This type of implementation is half-blind, and I will talk more about the cons we had to face when I get into the details of each implementation.

A post after a few weeks’ gap, and one with learnings and summaries on how it could have been done better.

Data type and conversions December 7, 2012

Posted by msrviking in DBA Rant, Performance tuning, T-SQL.

Today I downloaded the SQL Server Data Type Conversion Chart from this location. I remember this chart very well, and it’s available in the BOL that would have been installed on your system, or on MSDN online.

As soon as I saw the chart, it reminded me of the days when I used to explain its importance to the development team and always push the developers and leads to look up this topic in BOL. Today I am going to tell every team that uses SQL Server to download this and keep it around for as long as they are on the development side of the database.

One of the points that I leave with the teams is this:

“Make sure you don’t have variables declared in your code whose data types don’t match the data types designed into the tables. A mismatch means implicit conversions by the query engine, and when that happens it can lead to performance problems and deadlocks.”

There have been several situations where performance reviews and fixes ended with just a change of a variable’s data type in the T-SQL code. I don’t think I am going to stop sharing this point anytime, anywhere, in any developers’ forum whatsoever. There are experienced folks who make this mistake even today, and one can’t get rid of basic mistakes unless avoiding them is part of practice. Practice has to be diligent, and has to be cultivated.
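A minimal repro of the point above; the table and column here are hypothetical, but the pattern is the common one:

```sql
-- Column is varchar, variable is nvarchar: nvarchar has higher precedence,
-- so the column side gets implicitly converted, which can turn an index
-- seek on CustomerCode into a scan.
DECLARE @code nvarchar(20) = N'CUST0042';

SELECT OrderId
FROM dbo.Orders              -- assume CustomerCode is varchar(20), indexed
WHERE CustomerCode = @code;  -- implicit conversion on the column side

-- Matching the column's declared type avoids the conversion entirely:
DECLARE @code2 varchar(20) = 'CUST0042';

SELECT OrderId
FROM dbo.Orders
WHERE CustomerCode = @code2; -- an index seek is now possible
```

The fix is exactly the one the chart helps you spot: declare the variable with the same type the table designed.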

So if you see anyone writing a piece of code that could hit your performance benchmarks badly, please don’t hesitate to give this piece of advice again.

Let me know what you think.

Cheers and Enjoy!

DB code reviews November 17, 2009

Posted by msrviking in DBA Rant.

Sorry guys, I hadn’t been blogging for the past two weeks. I was a little tied up with work at home and at the office.

Well, here is what I wanted to share with you all. Yesterday one of the project managers approached me and narrated this:

“I have a bunch of developers who are not good at DB coding. Could you review the code and let us know how it looks before we go into UAT/Production? This will help us foresee the problems we will face; we would rather correct or rewrite the logic in the code now than troubleshoot unexpected performance issues later.”

This kind of talk always pleases me, and I gave my two cents on how we could go about it. Here is what I told the manager.

1. I shall review the code and share my comments on its quality so that the team can start fixing the wrong parts right away.

2. I also want to sit with the developer and the module lead so that I can share the knowledge of how the code should be written. If we only share review comments, it becomes just statistics – a fact that the code was reviewed – while the knowledge of writing better code is left out. It is very essential that we “mend” the minds of the developers to write code that favors set-based operations and avoids row-based operations.
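A small, made-up example of what I mean by set-based over row-based; the table and the 5% raise are purely illustrative:

```sql
-- Row-based: a cursor updating one row at a time, round-tripping
-- through the engine once per employee.
DECLARE @id int;
DECLARE c CURSOR FOR
    SELECT EmployeeId FROM dbo.Employees WHERE Dept = 'Sales';
OPEN c;
FETCH NEXT FROM c INTO @id;
WHILE @@FETCH_STATUS = 0
BEGIN
    UPDATE dbo.Employees SET Salary = Salary * 1.05
    WHERE EmployeeId = @id;
    FETCH NEXT FROM c INTO @id;
END
CLOSE c;
DEALLOCATE c;

-- Set-based: one statement, one pass, and the optimizer is free
-- to pick the best plan for the whole set.
UPDATE dbo.Employees
SET Salary = Salary * 1.05
WHERE Dept = 'Sales';
```

Both do the same work; the second expresses the intent as a set, which is what the engine is built for.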

To my surprise, I seemed to impress the project manager and could sell my thoughts. I wasn’t looking to sell my thoughts; I was interested in letting the manager know how important DB coding is, and how we could set it right in this project and in any project wherever these developers go.

Coincidentally, on similar if not the same lines, Buck Woody has written a blog post here which talks about “Jnan! – Knowledge” and how we could avoid post-deployment performance issues.

Enjoy reading!



DB Designing? Objective /Goal October 30, 2009

Posted by msrviking in DBA Rant.

One of my shop mates came up to me and said – “Hi! I did a database design for one of the projects. I need to set my goal so that my project supervisor can assess me at the end of the exercise.”

Well, my first thought was: what is a non-DB guy doing in database design? I am definitely not comparing myself with him or with anyone, but it surprises me sometimes when technical folks (especially application dev leads) go ahead and design tables. I definitely don’t have any aversion towards the apps team or the “great” tech leads, but in my shop it’s like “I create the mess! and then someone reviews and fixes.” Somehow, I don’t appreciate the idea of having the app team design the database unless the person is genuinely experienced in DB design along with application development. At the end of the day, it’s we DB guys who clean up the mess. Oh, I am cribbing a lot again – I always express all this in my posts, but my intent is to convey what is good and what is bad in the DB process.

After a little dodging from this guy, I decided to put some thought into what he had asked (a goal for the end of the activity). I am going to put down what I shared with him, and I would want you guys to share your thoughts too.

All these are from years of experience, so feel free to comment.

– Understand requirements through case studies, prototypes, documentation

– Map the requirements per module to a specific subject area (application modules, reporting, archiving, auditing, and any other few non-functional requirements)

– Identify key entities with high level attributes and prepare a conceptual data model

– Transform a conceptual data model to an E-R diagram with entities, attributes (key & non-key), relationships (1:N, N:1, M:N)

– Transform the logical data model into 3NF (third normal form)

– Walk the business analysts, the technical team (application development & DB teams), and the business team through the data model

– Identify the gaps during the walkthrough sessions and make changes to logical data model accordingly

– Transform an evolving logical data model to physical data model

– Review the physical data model with the Database Administrator to identify the types of objects that need to be created in the DB instance (tables, triggers, data types, constraints [unique, foreign key, nullable], indexes)

– Provide scripts to the Database Administrator for creating tables and other objects appropriately in the database

– Provide guidelines on the volume of data expected for each table, and plan for storage along with the DBA

– Ensure the scripts provided for object creation are appropriate per the understanding with the app team & DB team

– Evolve the data model as requirements keep changing (new or modified)

– Ensure that the data model conforms to the normalization rules

– Ensure that the data model is flexible for future changes and not so rigid that it causes overhead later

– Ensure the process of synchronizing the logical data model, the physical data model, and the physical objects is smooth

– Provide guidelines to the team on the use of the entities, attributes, and relationships from a logical data model perspective
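To make the physical-model review and the script handover above concrete, here is the kind of script I would expect to be handed to the DBA. The entities are made up for illustration; the point is that data types, constraints, and indexes all come out of the reviewed model:

```sql
-- Hypothetical physical-model output: table, data types,
-- constraints (primary key, unique, foreign key, default), and an index.
CREATE TABLE dbo.Customer
(
    CustomerId   int          NOT NULL IDENTITY(1,1),
    CustomerCode varchar(20)  NOT NULL,
    RegionId     int          NOT NULL,
    CreatedAt    datetime2(0) NOT NULL
        CONSTRAINT DF_Customer_CreatedAt DEFAULT (SYSUTCDATETIME()),
    CONSTRAINT PK_Customer PRIMARY KEY (CustomerId),
    CONSTRAINT UQ_Customer_Code UNIQUE (CustomerCode),
    CONSTRAINT FK_Customer_Region FOREIGN KEY (RegionId)
        REFERENCES dbo.Region (RegionId)
);

-- Supporting index for the foreign-key lookup path.
CREATE INDEX IX_Customer_RegionId ON dbo.Customer (RegionId);
```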


SQL Server DBA? October 29, 2009

Posted by msrviking in DBA Rant.

I had an assignment around two weeks ago which was one of the toughest I can recall. One of the clients had a performance issue on one of their boxes and wasn’t able to identify the cause.

Someone from the DBA group (where I don’t belong) had been identified to work on this – identify the problem & provide recommendations. I was asked to guide this gentleman while he was on the job! Well, it just happens sometimes that I have to spend my energy thinking about my own activities on my table and about others’ as well. This is part of my job and I definitely won’t gripe about it. I told myself, “I need to help this guy just the way I would help myself.”

All was set to take off, but the one thing missing from the beginning to the end of the exercise was that the DBA was not communicating. I did follow-ups, mails, calls (short guidance sessions), visited the DBA’s workplace to check whether we were headed in the right direction, and so on. All effort was on, and finally, after the results were obtained from the monitoring scripts, the analysis phase started. The same communication problem persisted during this stage too, and I had no idea what was happening at the other end.

One fine evening I got a recommendation list to evaluate and validate, after a single-line follow-up. The next day I spent three hours trying to collate information from large logs and trying to validate it. In vain! I couldn’t complete it, and the presentation was scheduled for the next day :o. I gave up at the end, offering my high-level thoughts, and as expected it all backfired – the client wasn’t happy with what was done.

And then one day I was asked to understand everything, analyze, recommend, and present my findings. It was hell – five continuous days running through large logs (> 1.5 GB), traces, and performance counters, and trying to tie several pieces of information together. A crazy time, but I didn’t crib at all. In the end I could present findings and recommendations and save the face of the activity called “DB Performance Analysis”. But one learning – be whatever you are (multi-certified, well recognized), things won’t work if you don’t communicate. Communication is so important for any activity in life (work/personal). So, folks, if you find yourself in such a situation, help the other person communicate, or you will be stuck as I was.

I hope this experience of mine helps anyone else out there.


Something is common May 7, 2009

Posted by msrviking in DBA Rant.

Hi guys, I was reading through my favorite blog list today, and I happened to read this blog entry by Linchi Shea. I totally agree with what Linchi says; he has brought out the clear-cut differences between a server and a desktop in a simple way.

I would like to stress to the people who are on development projects: don’t host your databases on a desktop and then say you have performance problems. And I hope the project managers and tech leads out there hear this. Please note that I am directing this message at people who really should know the difference.


At a Loss May 5, 2008

Posted by msrviking in DBA Rant.

This is how my life goes too! An interesting article by Sean; I am a regular reader of his frank blog entries on how a DBA’s life goes.


Happy reading!