Tuesday 3 March 2015

Entity Framework 6



Error -
Code generated using the T4 templates for Database First and Model First development may not work correctly if used in Code First mode. To continue using Database First or Model First ensure that the Entity Framework connection string is specified in the config file of executing application. To use these classes, that were generated from Database First or Model First, with Code First add any additional configuration using attributes or the DbModelBuilder API and then remove the code that throws this exception

Reason - a plain ADO.NET connection string is being used rather than the Entity Framework connection string, which must also carry the model metadata references (.csdl, .ssdl and .msl).
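As a rough sketch of the difference, the following builds the EF-style connection string in code with EntityConnectionStringBuilder (the MyModel name, server and database details are placeholders for illustration):

using System;
using System.Data.Entity.Core.EntityClient;
using System.Data.SqlClient;

class EntityConnectionStringSample
{
    static void Main()
    {
        // A plain ADO.NET connection string - what Code First expects.
        var sqlBuilder = new SqlConnectionStringBuilder
        {
            DataSource = ".",
            InitialCatalog = "MyDatabase",
            IntegratedSecurity = true
        };

        // The EF connection string Database/Model First needs: it also carries
        // references to the csdl/ssdl/msl metadata embedded in the assembly.
        var entityBuilder = new EntityConnectionStringBuilder
        {
            Provider = "System.Data.SqlClient",
            ProviderConnectionString = sqlBuilder.ToString(),
            Metadata = "res://*/MyModel.csdl|res://*/MyModel.ssdl|res://*/MyModel.msl"
        };

        Console.WriteLine(entityBuilder.ConnectionString);
    }
}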

Wednesday 23 May 2012

Test Driven Development: You are going to shoot me now but be PRAGMATIC


Test-driven development (TDD) is more than just writing unit tests to make us happy when the progress bar goes from red to green. It promotes writing clean code and demotes unnecessary documentation. Refactoring enables clean code - it helps you find the simplest design for your class. Test cases written against a class describe its intention in themselves, so why document the class separately?

No need to write test cases for all the classes:
Having said that, we need to be pragmatic. I know not all of you will agree with me - there is no need to write test cases for every class you implement! Don't shoot me just yet :) A better approach is to identify the classes that are responsible for implementing business rules. Business rules change often, and those changes ripple into the classes. This is where the beauty of TDD comes into play and your hard work starts to pay off: you can run the test cases you have written, and they warn you there and then whether a new change has broken another part of the system. Bingo - regression testing is so sweet now.
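As a sketch of what I mean, here is a minimal NUnit test against a hypothetical business-rule class (DiscountCalculator and its rule are made up for illustration):

using NUnit.Framework;

// Hypothetical business rule: orders over £100 get a 10% discount.
public class DiscountCalculator
{
    public decimal Apply(decimal orderTotal)
    {
        return orderTotal > 100m ? orderTotal * 0.9m : orderTotal;
    }
}

[TestFixture]
public class DiscountCalculatorTest
{
    [Test]
    public void OrdersOverOneHundredPoundsGetTenPercentDiscount()
    {
        var calculator = new DiscountCalculator();

        Assert.AreEqual(180m, calculator.Apply(200m));
        // If the rule ever changes, this test fails first - instant regression warning.
    }
}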

To Mock or not to Mock:
A mocking framework helps when your class depends on external services - database calls, file system manipulation, web service invocations and so on. The fundamental idea is that your code is deemed to work as long as the values you expect from the dependent system are correct! It helps satisfy a TDD principle - a unit test shouldn't take more than 10 milliseconds to execute, and a continuous build shouldn't take more than 10 minutes to produce a new build. All sounds good. My worry is that we are given a false sense of security. I have to be bold: if possible, avoid mocking. Don't shoot me, please - I am trying to be practical and just want to make sure my system works as it should. As an example, imagine you have written data access repository classes. How can you say they meet the business requirements unless you have written test cases against those classes that actually hit the database? Am I not asking a valid question? If you are in the same boat as I am, then follow a simple rule - reset the database to a known state, run your test cases and wait for the bar to go green!

using NUnit.Framework;

[TestFixture]
public class DbTest
{
    [SetUp]
    public void ResetDatabaseToKnownState()
    {
        // Restore the test database to a known baseline before each test,
        // e.g. truncate the tables under test and re-seed reference data.
    }

    [TearDown]
    public void CleanUpYourDirt()
    {
        // Remove any rows the test created so the next run starts clean.
    }

    [Test]
    public void DoTestHittingDb()
    {
        // Exercise the repository against the real database
        // and assert on the rows it returns.
    }
}

Thanks,
Milan Gurung

Tuesday 15 May 2012

Failing to "Create New Complex Type" when promoting Stored procedure to Function in EF


A stored procedure that creates a temporary table (#Temp) needs that declaration replaced with a TABLE variable (@Temp). The designer retrieves the result-set metadata with SET FMTONLY ON, and temporary tables are not created in that mode, so EF cannot work out the columns it needs to build the complex type.

e.g.

CREATE PROCEDURE spDoSomethingAmazing
AS
BEGIN
    CREATE TABLE #Temp (ID INT, Name VARCHAR(50))

    INSERT INTO #Temp
    SELECT ID, Name
    FROM TestTable

    SELECT ID, Name
    FROM #Temp

    DROP TABLE #Temp
END

-- Modified version of Stored procedure so Entity Framework can generate Complex Type

CREATE PROCEDURE spDoSomethingAmazingForEF
AS
BEGIN
    DECLARE @Temp TABLE (ID INT, Name VARCHAR(50))

    INSERT INTO @Temp
    SELECT ID, Name
    FROM TestTable

    SELECT ID, Name
    FROM @Temp
END

Tuesday 1 May 2012

SqlBulkCopy – a hidden gem in ADO.NET library


Developers implementing data-driven applications are very familiar with the ADO.NET library. Those working closely with SQL Server day in, day out know classes like DataSet, SqlDataReader, SqlConnection, SqlCommand and so on. But have you heard of SqlBulkCopy, available in the System.Data.SqlClient namespace since .NET Framework 2.0? Perhaps not all of us have. If you haven't, then I am sure you will be tempted to use it in your next killer application, or you might even refactor your existing applications.

SqlBulkCopy: As developers we always want to improve an application, whether by making the UI better, loading data quicker or doing some other clever stuff to satisfy a broad range of end users - continuous improvement (Kaizen) is the key. If there is a need to import a large dataset containing millions of records, you have many options. I have listed two of the most widely used methods below.

1. Read the source file line by line and insert records into the database one at a time. This is very time consuming.
2. Create an SSIS package to do the import. This is the fastest way, and it works fine as long as the source file always keeps the same header information.

Unfortunately, source file headers do get modified for various reasons, and that change shouldn't break your data import process. This is where the SqlBulkCopy class comes in handy. Its column mapping feature lets you map each source column header to the correct destination table column. It can also notify you of the number of records inserted, so you can update a UI progress bar accordingly. Data loading is amazingly quick and drastically reduces the time taken to get data into the database. My personal experience is that a source file containing half a million records can be loaded in a couple of seconds, where method 1 above would take a few minutes.
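Here is a minimal sketch of both features - column mapping and progress notification (the connection string, dbo.Orders table and column names are placeholders for illustration):

using System;
using System.Data;
using System.Data.SqlClient;

class BulkLoader
{
    static void Main()
    {
        // Stand-in for rows parsed from the source file.
        var table = new DataTable();
        table.Columns.Add("CustomerName", typeof(string));
        table.Columns.Add("OrderTotal", typeof(decimal));
        table.Rows.Add("Acme Ltd", 125.50m);

        using (var connection = new SqlConnection(
            "Data Source=.;Initial Catalog=SalesDb;Integrated Security=True"))
        {
            connection.Open();

            using (var bulkCopy = new SqlBulkCopy(connection))
            {
                bulkCopy.DestinationTableName = "dbo.Orders";

                // Map source headers to destination columns, so a renamed or
                // reordered column in the source file doesn't break the import.
                bulkCopy.ColumnMappings.Add("CustomerName", "Name");
                bulkCopy.ColumnMappings.Add("OrderTotal", "Total");

                // Raise SqlRowsCopied every 10,000 rows to drive a progress bar.
                bulkCopy.NotifyAfter = 10000;
                bulkCopy.SqlRowsCopied += (sender, e) =>
                    Console.WriteLine("{0} rows copied...", e.RowsCopied);

                bulkCopy.WriteToServer(table);
            }
        }
    }
}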

What does the SqlBulkCopy class look like? Please refer to MSDN.

Thanks,
Milan Gurung



Tuesday 24 April 2012

Offshore to Onshore software development experience:

More than half of my professional career so far as a software developer has involved implementing bespoke systems for various industries. I spent my early days in an offshore software development house writing small to fairly medium-sized Windows-based applications, mainly for customers based in the US. Some of the smaller contracts were delivered within a week by a single developer, whereas larger ones took many months (8-15) with 4-5 developers working in a team practising SCRUM principles. It was fun. Real fun! Looking back, it feels so good. I am grateful to my employer back then (HimalayanTechies). I was given "licence to learn" freedom to play with state-of-the-art technologies, and they were very supportive throughout my time with them. The best part was that you were the one to decide which development tools and platforms to use for your next project.
Moving on from offshore to onshore software development, it has been quite nice, to be honest. I am not quite sure whether "onshore development" is the correct term for developing software for customers within easy commuting distance - and, more importantly, in the same time zone. New culture, new lifestyle, far away from the friends you grew up with. Everything is new. But once you get used to the new environment you start to feel comfortable, and the better news is that the development methodologies, principles and best practices are not new. They are the same, perhaps slightly amended to fit each organisation's working practices and interpretations. That's alright; it shouldn't scare anyone.

Leaving the financial side to smart people, working onshore is quite exciting for a developer. Being in the same time zone and understanding the culture and current market trends helps you grasp the customer's requirements quickly. It is easier to make decisions such as choosing UI layouts, colour themes and the workflow of the business process. In addition, you get the chance to use stable, off-the-shelf commercial components that make development nice and easy (RAD), allowing you to focus on the business requirements. You are therefore in a better position to deliver the product on time and within budget - and hopefully with fewer bugs :) - assuming you came from an offshore team where purchasing commercial products was rarely affordable.

Monday 23 April 2012

EF4.x: A dependent property in a ReferentialConstraint is mapped to a store-generated column

Next time I see the error "A dependent property in a ReferentialConstraint is mapped to a store-generated column..." I will look at the foreign key constraints in the SQL table design rather than hunting for a solution in the codebase itself!

If a foreign key relationship points to the wrong column of a table, you can lose precious development time and your good mood.

E.g.
Employee (EmpID [PK], FName, Surname)
Qualification (QualID [PK], EmpID [FK], QName, GraduationYear)

A careless mistake: instead of the EmpID column of the Qualification table, you give QualID the FK relationship with EmpID of the Employee table. BOOM!!!

No error appears during entity diagram generation, but when you try to insert a record into the Qualification table with an EmpID value that does not exist in the Employee table, you will be presented with the runtime error above.
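To make it concrete, here is a minimal sketch of the insert that surfaces the error, using hypothetical designer-generated types for the schema above (EF4 ObjectContext style; CompanyEntities is a made-up context name):

using (var context = new CompanyEntities()) // hypothetical designer-generated context
{
    var qualification = new Qualification
    {
        EmpID = 999, // no matching row in Employee
        QName = "BSc Computing",
        GraduationYear = 2010
    };

    context.Qualifications.AddObject(qualification);

    // With the FK accidentally created on QualID (a store-generated identity)
    // instead of EmpID, this line throws:
    // "A dependent property in a ReferentialConstraint is mapped to a store-generated column..."
    context.SaveChanges();
}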


Cheers,
Milan Gurung



Wednesday 18 April 2012

SQL Federation - is it a good fit for migrating existing systems to the Windows Azure platform?


I have been looking into SQL Federation since the first release. Database sharding comes out of the box: you can split your database into multiple smaller databases as it grows. Federated databases maintain throughput, so your application can serve data without degradation.

Million dollar question: Is SQL Federation good for existing systems?

I am in the same boat as most of my colleagues: I love SQL Federation, and without thinking twice I will embrace it for greenfield projects in future. For existing systems, though, I am not so sure.

Reasons -

SQL Federation requires radical changes to the existing database schema. It doesn't support IDENTITY columns, so developers have to generate their own keys. Every table must have a clustered index or data insertion fails :( Our product line has more than 50 tables, and changing those tables means changing the data access layer too. In addition, there is no support for the MERGE command yet.
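As a small illustration of the key-generation point, here is a minimal sketch that supplies its own key instead of relying on IDENTITY (the connection string, dbo.Customers table and columns are placeholders for illustration):

using System;
using System.Data.SqlClient;

class FederatedInsertSample
{
    static void Main()
    {
        using (var connection = new SqlConnection(
            "Server=tcp:myserver.database.windows.net;Database=SalesDb;User ID=user;Password=...;"))
        {
            connection.Open();

            using (var command = connection.CreateCommand())
            {
                // No IDENTITY column available, so the key is generated in code.
                command.CommandText =
                    "INSERT INTO dbo.Customers (CustomerID, Name) VALUES (@id, @name)";
                command.Parameters.AddWithValue("@id", Guid.NewGuid());
                command.Parameters.AddWithValue("@name", "Acme Ltd");
                command.ExecuteNonQuery();
            }
        }
    }
}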

What solution do we have now? 

Rather than using SQL Federation, existing systems will just use a SQL Azure database - perhaps one database per customer, for the sake of argument. Although this architecture suffers from 'Connection Pool Fragmentation', migration to the Windows Azure platform would be easier, with fewer modifications to the existing codebase and no database re-architecture.

The 'Connection Pool Fragmentation' issue is explained clearly by Cihan in his blog here.


Cheers,
Milan Gurung