Wednesday 23 May 2012

Test Driven Development: You are going to shoot me now but be PRAGMATIC


Test driven development (TDD) is more than just writing unit test cases to make us happy when the progress bar goes from red to green. It promotes writing clean code and cuts down on unnecessary documentation. Code refactoring enables clean code and helps you find the simplest design for your class. The test cases written against a class describe its intent in themselves, so why document the class description unnecessarily?

No need to write test cases for all the classes:
Having said that, we need to be pragmatic. I know not all of you agree with me: there is no need to write test cases for every class you implement! You are not going to shoot me just yet :) I think the better approach is to identify the classes that are responsible for implementing business rules. Business rules change often, and those changes ripple into the classes. This is where the beauty of TDD comes into play and your hard work starts to pay off: you can re-run the test cases you have written, and they warn you there and then whether the new changes have broken another part of the system. Bingo - regression testing is so sweet now.
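To make this concrete, here is a minimal sketch of a test case against a business-rule class, in the NUnit style used later in this post. The DiscountCalculator class, its discount rule and the test names are all hypothetical - substitute your own business-rule classes.

```csharp
using NUnit.Framework;

// Hypothetical business-rule class: orders of 100 or more units
// get a 10% discount; everything else pays full price.
public class DiscountCalculator
{
    public decimal Apply(decimal price, int quantity)
    {
        if (quantity >= 100)
            return price * 0.90m;
        return price;
    }
}

[TestFixture]
public class DiscountCalculatorTests
{
    [Test]
    public void BulkOrdersGetTenPercentOff()
    {
        var calculator = new DiscountCalculator();
        Assert.AreEqual(90m, calculator.Apply(100m, 100));
    }

    [Test]
    public void SmallOrdersPayFullPrice()
    {
        var calculator = new DiscountCalculator();
        Assert.AreEqual(100m, calculator.Apply(100m, 10));
    }
}
```

When the business rule changes (say, the threshold drops to 50 units), these tests fail immediately and tell you exactly which behaviour moved - that is the regression safety net described above.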

To Mock or not to Mock:
A mocking framework assists when your class depends on external services - database calls, file system manipulation, invoking web services and so on. The fundamental idea is that your code is working fine as long as the value you expect from the dependent system is correct! It helps satisfy the TDD guideline that a unit test case shouldn't take more than 10 milliseconds to execute; moreover, a continuous build shouldn't take more than 10 minutes to produce a new build. All sounds good. My worry is that we are given a false sense of security. I have to be bold: if possible, avoid mocking. Don't shoot me please. I am trying to be practical and just want to make sure my system works as it should. As an example, imagine you have written data access repository classes. How can you say they meet the business requirements unless you have written test cases against those classes that actually hit the database? Am I not asking a valid question? If you are in the same boat as I am, then I follow a simple rule - reset the database to a known state, run your test cases and wait for the bar to go green!

[TestFixture]
public class DbTest
{
    [SetUp]
    public void ResetDatabaseToKnownState()
    { .... }

    [TearDown]
    public void CleanUpYourDirt()
    { ... }

    [Test]
    public void DoTestHittingDb()
    { .. }
}
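What might ResetDatabaseToKnownState actually do? Here is one possible sketch: build a script that wipes and reseeds the tables the fixture touches, then execute it over a SqlConnection. The table name, seed rows and connection string are illustrative only - adapt them to your own schema.

```csharp
using System.Data.SqlClient;
using System.Text;

public static class TestDatabase
{
    // Builds the T-SQL that returns the test tables to a known state.
    // Kept as a separate pure method so it can be inspected and unit tested.
    public static string BuildResetScript()
    {
        var sql = new StringBuilder();
        sql.AppendLine("DELETE FROM TestTable;");
        sql.AppendLine("DBCC CHECKIDENT ('TestTable', RESEED, 0);");
        sql.AppendLine("INSERT INTO TestTable (Name) VALUES ('KnownRow1');");
        sql.AppendLine("INSERT INTO TestTable (Name) VALUES ('KnownRow2');");
        return sql.ToString();
    }

    // Called from [SetUp] before every test that hits the database.
    public static void Reset(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(BuildResetScript(), connection))
        {
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}
```

Because every test starts from the same seed data, a red bar always means the code changed behaviour, not that a previous test left dirt behind.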

Thanks,
Milan Gurung

Tuesday 15 May 2012

Failing to "Create New Complex Type" when promoting Stored procedure to Function in EF


A stored procedure containing a temporary table declaration needs it replaced with a TABLE variable declaration before Entity Framework can generate a Complex Type for it.

e.g.

CREATE PROCEDURE spDoSomethingAmazing
AS
BEGIN
    CREATE TABLE #Temp (ID INT, Name VARCHAR(50))

    INSERT INTO #Temp
    SELECT ID, Name
    FROM TestTable

    SELECT ID, Name FROM #Temp

    DROP TABLE #Temp
END

-- Modified version of Stored procedure so Entity Framework can generate Complex Type

CREATE PROCEDURE spDoSomethingAmazingForEF
AS
BEGIN
    DECLARE @Temp TABLE (ID INT, Name VARCHAR(50))

    INSERT INTO @Temp
    SELECT ID, Name
    FROM TestTable

    SELECT ID, Name FROM @Temp
END

Tuesday 1 May 2012

SqlBulkCopy – a hidden gem in ADO.NET library


Developers implementing data-driven applications are very familiar with the ADO.NET library. Those working closely with SQL Server day in, day out know classes like DataSet, SqlDataReader, SqlConnection, SqlCommand and so on. Have you heard of SqlBulkCopy, available in the System.Data.SqlClient namespace since .NET Framework 2.0? Perhaps not all of us have. If you haven't, then I am sure you will be tempted to use it in your next killer application, or maybe you might refactor your existing applications.

SqlBulkCopy:  As developers we always want to improve an application, whether by making the UI better, loading data quicker or doing some other clever stuff to satisfy a broad range of end users - continuous improvement (Kaizen) is the key. If there is a need to import a large dataset containing millions of records, you have many options. I have listed two of the most widely used methods.

1. Read the source file line by line and insert the records into the database one at a time. This is very time consuming.
2. Create an SSIS package to do the import. This is the fastest way, and it works fine as long as the source file always keeps the same header information.

Unfortunately, the source file's header information gets modified for different reasons, and this change shouldn't break your data import process. Here the SqlBulkCopy class comes in handy. Its column-mapping feature lets you map each source column header to the correct destination table column. It also notifies you of the number of records inserted so far, so you can update a UI progress bar accordingly. Data loading is amazingly quick and drastically reduces the time taken to load data into the database. My personal experience is that a source file containing half a million records could be loaded in a couple of seconds, where it would otherwise take a few minutes using method 1 outlined above.

What does the SqlBulkCopy class look like? Please refer to MSDN.
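As a sketch of the approach, assuming a destination table dbo.TestTable with ID and Name columns and a source file whose header names differ from the destination's - the table, column and header names here are illustrative only:

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

public static class BulkLoader
{
    // Build an in-memory table from already-parsed source lines, where each
    // line is [id, name]. The source header names are illustrative.
    public static DataTable BuildTable(IEnumerable<string[]> parsedLines)
    {
        var table = new DataTable();
        table.Columns.Add("CustomerId", typeof(int));
        table.Columns.Add("CustomerName", typeof(string));
        foreach (var line in parsedLines)
            table.Rows.Add(int.Parse(line[0]), line[1]);
        return table;
    }

    public static void Load(string connectionString, DataTable source)
    {
        using (var bulkCopy = new SqlBulkCopy(connectionString))
        {
            bulkCopy.DestinationTableName = "dbo.TestTable";

            // Map source headers to destination columns by name, so a
            // reordered or renamed source column does not break the import.
            bulkCopy.ColumnMappings.Add("CustomerId", "ID");
            bulkCopy.ColumnMappings.Add("CustomerName", "Name");

            // Raise SqlRowsCopied every 10,000 rows to drive a progress bar.
            bulkCopy.NotifyAfter = 10000;
            bulkCopy.SqlRowsCopied += (sender, e) =>
                Console.WriteLine("{0} rows copied...", e.RowsCopied);

            bulkCopy.WriteToServer(source);
        }
    }
}
```

If the source file's headers change, only the ColumnMappings lines need updating - the rest of the import pipeline stays untouched.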

Thanks,
Milan Gurung