Friday, 22 April 2011

How to use MEF with C#

MEF stands for Managed Extensibility Framework and allows C# developers to develop plug-in architectures for their solutions.

This allows any C# project (WPF, Silverlight, ASP.NET, Winforms, Class Library, etc) to make use of external functionality dynamically at runtime without the need for a hard-coded reference.  This is made possible through the use of interfaces and MEF with its Import and Export attributes, and only takes a few lines of code.

MEF is part of the .NET Framework v4.0 and is being used in future versions of Visual Studio to provide its plug-in model.

Architecture

In our demo the host application, in this instance a console app, will load two plug-ins that provide calculation services.  Both will implement a Calculate method accepting two integers and returning an integer.  The first plug-in will add the two numbers together and the second will multiply them.

[Diagram: demo architecture]

Coding the Demo

First create a console application called MefSimpleDemo.exe, then add three class libraries to the solution:

  SharedContracts (for our interfaces)

  CalculationService1 (plug-in)

  CalculationService2 (plug-in)

The console application will need to reference MEF and our interfaces (SharedContracts), and the SharedContracts project will need to reference MEF, so right-click and add the references for each project.

MEF lives in the System.ComponentModel.Composition namespace (System.ComponentModel.Composition.dll).

[Screenshot: adding the System.ComponentModel.Composition reference]

[Screenshot: adding the SharedContracts reference]

In our example the plug-ins will be loaded from a known directory location.  In production you would load this location from config, but for this demo we’ll hard-code it (the @”../plugins” folder) in the host and build the plug-ins’ output to the same folder.  To alter the output, right-click each plug-in project, select Properties, click the “Build” tab and update the “Output Path”:

[Screenshot: setting the plug-in Output Path]

SharedContracts.DLL Class Library

Remove the default Class1.cs file, add a file called ICalculationService.cs and paste in the following code:
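A minimal sketch of the interface described above; the only MEF-specific part is the InheritedExport attribute:

using System.ComponentModel.Composition;

namespace SharedContracts
{
    // InheritedExport means every class that implements this interface is
    // automatically exported to MEF without needing its own Export attribute.
    [InheritedExport]
    public interface ICalculationService
    {
        int Calculate(int a, int b);
    }
}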

NOTE: Notice the (MEF) InheritedExport attribute on the interface; this allows MEF to discover the implementing classes within the plug-ins.

CalculationService1.DLL Class Library


Remove the default Class1.cs file, add a file called Addition.cs and paste in the following code:
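A sketch of the first plug-in; it simply implements the shared interface and MEF discovers it via the InheritedExport on ICalculationService:

using SharedContracts;

namespace CalculationService1
{
    // Adds the two numbers together
    public class Addition : ICalculationService
    {
        public int Calculate(int a, int b)
        {
            return a + b;
        }
    }
}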

 

CalculationService2.DLL Class Library

Remove the default Class1.cs file, add a file called Multiply.cs and paste in the following code:
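The second plug-in is identical apart from the arithmetic (again a sketch):

using SharedContracts;

namespace CalculationService2
{
    // Multiplies the two numbers together
    public class Multiply : ICalculationService
    {
        public int Calculate(int a, int b)
        {
            return a * b;
        }
    }
}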

MefSimpleDemo.EXE Console Application

Add a new file called PluginRepository.cs and paste in the following code:
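A sketch of a repository that composes the plug-ins from the plugins folder (the LoadPlugins and Plugins names are arbitrary; the MEF pieces are DirectoryCatalog, CompositionContainer and ImportMany):

using System.Collections.Generic;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;
using SharedContracts;

namespace MefSimpleDemo
{
    public class PluginRepository
    {
        // MEF fills this collection with every ICalculationService it finds
        [ImportMany]
        public IEnumerable<ICalculationService> Plugins { get; set; }

        public void LoadPlugins()
        {
            // Scan the plugins folder for assemblies containing exports
            var catalog = new DirectoryCatalog(@"../plugins");
            var container = new CompositionContainer(catalog);
            container.ComposeParts(this);
        }
    }
}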

Finally replace the text in the Program.cs file with the following code:
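A sketch of the console host, which loads the plug-ins and calls each one:

using System;

namespace MefSimpleDemo
{
    internal class Program
    {
        private static void Main()
        {
            var repository = new PluginRepository();
            repository.LoadPlugins();

            // Each discovered plug-in gets called with the same inputs
            foreach (var service in repository.Plugins)
            {
                Console.WriteLine("{0}: Calculate(3, 4) = {1}",
                                  service.GetType().Name, service.Calculate(3, 4));
            }

            Console.ReadLine();
        }
    }
}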


Your solution should now look like this, notice I’ve removed all redundant references from each of the projects:

[Screenshot: Solution Explorer]


You are now ready to run the application; press F5 and check out your results:

[Screenshot: console output]


You can download the source code here: 

http://stevenhollidge.com/blog-source-code/MefSimpleDemo.zip

How to use Dapper & PetaPoco, Micro ORMs

With Microsoft now firmly pushing Entity Framework (currently at version 4) as its preferred data access solution, it’s refreshing to see a couple of new players on the scene, albeit perhaps not in direct competition.

Dapper (Dynamic Query Mapper) and PetaPoco are Micro ORMs.  An ORM is an Object Relational Mapper, which sits between your business logic and database to abstract away the direct link between the two.  Micro ORMs are a simpler and more efficient alternative to larger ORM frameworks such as EF4 and NHibernate.

In this demo (full source code downloadable from the links at the end of this article) I’ll be giving examples of how to use each of the frameworks against a SQL Server database for your CRUD (create/insert, read/select, update, delete) commands.  I’m using the tempdb database so there is no need to run any database scripts.

It’s important to note that for performance you still get a marginal benefit from creating a hand-coded DAL with SqlConnection, SqlCommand and all those SqlParameter objects, like this:

public Customer Select()
{
    Customer customer = null;

    using (var connection = SqlHelper.GetOpenConnection())
    using (var command = new SqlCommand(SqlHelper.ReadStatement, connection) { CommandType = CommandType.Text })
    {
        // CustomerId is an nchar(5), so the parameter is typed accordingly
        command.Parameters.Add("@Id", SqlDbType.NChar, 5);
        command.Parameters[0].Value = SqlHelper.TestId;

        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                customer = new Customer
                {
                    CustomerId = reader.GetString(0),
                    CompanyName = reader.GetString(1)
                };
            }
        }
    }

    return customer;
}

But hand coding your own entire DAL is pretty time consuming and hard to maintain, particularly when you’ve got commands with plenty of parameters.  You could use T4 to generate your DAL based on your database schema, and that’s probably still your best bet if performance is everything to your application.  Or, for a minimal perf overhead (most LOB apps wouldn’t notice), you could use one of our new Micro ORMs!


To carry out the same task as the hand-coded DAL above, Dapper requires only two lines of code:

var mapperConnection = SqlHelper.GetOpenConnection();
var customers = mapperConnection.Query<Customer>(SqlHelper.ReadStatement, new { Id = SqlHelper.TestId });

Note: This is using the typed Query<Customer> method; you can also use a dynamic version that omits the generic parameter and returns dynamic objects (introduced in .NET Framework 4).
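For example (a sketch reusing the SqlHelper statements shown later in this post; CustomerID and CompanyName are columns on the demo Customers table):

// Dynamic query: no Customer class required, each row exposes
// its columns as properties at runtime
var rows = mapperConnection.Query(SqlHelper.ReadStatement, new { Id = SqlHelper.TestId });
foreach (dynamic row in rows)
{
    Console.WriteLine("{0} - {1}", row.CustomerID, row.CompanyName);
}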

For PetaPoco only three lines of code are required:

var petaPoco = new PetaPoco.Database(SqlHelper.ConnectionString, "System.Data.SqlClient");
petaPoco.OpenSharedConnection();
var customers = petaPoco.Fetch<Customer>(SqlHelper.ReadStatementWithIndexedParams, SqlHelper.TestId);

Note: This example uses the default execution options; you can also tweak the options to boost performance, as shown later in this post.


You can see that from a RAD point of view Micro ORMs really come into their own compared to hand coding your DAL.  Since the start of .NET plenty of frameworks have been created to generate your SQL commands and parameters (based on your DAL method attributes or parameters), but these Micro ORM frameworks are special.


Firstly, other than one file for each framework, no other code is required to pollute your solution.  No attributes clogging up your DAL or DTOs or business layers.  But now for the really special bit: both frameworks actually generate code dynamically using Dynamic Methods and ILGenerators, which is pretty cool and inspiring stuff.  The source code for both frameworks is included in the source code download for this blog post and I encourage you to take a look – kudos to the guys that put these frameworks together!
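To give a flavour of the technique, here’s a tiny, self-contained sketch (nothing to do with either ORM’s actual code) that emits a method at runtime using DynamicMethod and ILGenerator:

using System;
using System.Reflection.Emit;

internal static class DynamicMethodDemo
{
    public static void Run()
    {
        // Emit a method equivalent to: int Add(int a, int b) { return a + b; }
        var dm = new DynamicMethod("Add", typeof(int), new[] { typeof(int), typeof(int) });
        var il = dm.GetILGenerator();
        il.Emit(OpCodes.Ldarg_0);
        il.Emit(OpCodes.Ldarg_1);
        il.Emit(OpCodes.Add);
        il.Emit(OpCodes.Ret);

        // Bake it into a delegate and call it like any other method
        var add = (Func<int, int, int>)dm.CreateDelegate(typeof(Func<int, int, int>));
        Console.WriteLine(add(2, 3)); // 5
    }
}

The ORMs apply the same idea to build, per query, a method that reads a data record and sets the matching properties on your object, so the reflection cost is effectively paid once rather than per row.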


In terms of performance, the full Dapper project (link also at the end of the post) includes a test project that runs a simple select statement 500 times and compares performance against various data access solutions, the results of which can be seen here:


[Screenshot: Dapper performance test results]


Note: In the results Dapper appears as Mapper Query when using the typed/generic read method and Dynamic Mapper Query when using the non-generic method.


2nd Note: These figures were produced on a travel netbook (I’m writing this blog post whilst stuck in Delhi airport for the night being eaten alive by mosquitos :) ).  I would expect them to be much faster on a newer PC or laptop.


Examples of Create, Read, Update and Delete (CRUD) code



Dapper

private static void DapperExamples()
{
    Console.WriteLine("Running Dapper examples");

    SqlConnection mapperConnection = SqlHelper.GetOpenConnection();

    // CREATE
    mapperConnection.Execute(SqlHelper.CreateStatement,
        new
        {
            Id = SqlHelper.TestId,
            CompanyName = SqlHelper.InsertCompanyName
        });

    // READ Mapper Query
    IEnumerable<Customer> customer1 =
        mapperConnection.Query<Customer>(SqlHelper.ReadStatement, new { Id = SqlHelper.TestId });

    // READ Dynamic Object Mapper Query
    var customer2 = mapperConnection.Query(SqlHelper.ReadStatement, new { Id = SqlHelper.TestId });

    // UPDATE
    mapperConnection.Execute(SqlHelper.UpdateStatement,
        new
        {
            Id = SqlHelper.TestId,
            CompanyName = SqlHelper.UpdateCompanyName
        });

    // DELETE
    mapperConnection.Execute(SqlHelper.DeleteStatement, new { Id = SqlHelper.TestId });
}

PetaPoco

private static void PetaPocoExamples()
{
    Console.WriteLine("Running PetaPoco examples");

    var petapoco = new Database(SqlHelper.ConnectionString, "System.Data.SqlClient");
    petapoco.OpenSharedConnection();

    // CREATE
    petapoco.Execute(SqlHelper.CreateStatementWithIndexedParams, SqlHelper.TestId, SqlHelper.InsertCompanyName);

    // READ with all default options
    List<Customer> customer1 = petapoco.Fetch<Customer>(SqlHelper.ReadStatementWithIndexedParams,
                                                        SqlHelper.TestId);

    // READ with some "smart" functionality disabled
    var petapocoFast = new Database(SqlHelper.ConnectionString, "System.Data.SqlClient");
    petapocoFast.OpenSharedConnection();
    petapocoFast.EnableAutoSelect = false;
    petapocoFast.EnableNamedParams = false;
    petapocoFast.ForceDateTimesToUtc = false;
    List<Customer> customer2 = petapocoFast.Fetch<Customer>(SqlHelper.ReadStatementWithIndexedParams,
                                                            SqlHelper.TestId);

    // UPDATE
    petapoco.Execute(SqlHelper.UpdateStatementWithIndexedParams, SqlHelper.TestId, SqlHelper.UpdateCompanyName);

    // DELETE
    petapoco.Execute(SqlHelper.DeleteStatementWithIndexedParams, SqlHelper.TestId);
}

SqlHelper class

internal static class SqlHelper
{
    public static readonly string ConnectionString =
        @"Data Source=.;Initial Catalog=tempdb;Integrated Security=True";

    public static readonly string CreateStatement =
        @"INSERT dbo.Customers (CustomerID, CompanyName) SELECT @id, @companyName";

    public static readonly string ReadStatement = @"SELECT * FROM dbo.Customers WHERE CustomerId = @id";

    public static readonly string UpdateStatement =
        @"UPDATE dbo.Customers SET CompanyName = @companyName WHERE CustomerId = @id";

    public static readonly string DeleteStatement = @"DELETE FROM dbo.Customers WHERE CustomerId = @id";

    // PetaPoco parameters are named based on index
    public static readonly string CreateStatementWithIndexedParams =
        @"INSERT dbo.Customers (CustomerID, CompanyName) SELECT @0, @1";

    public static readonly string ReadStatementWithIndexedParams =
        @"SELECT * FROM dbo.Customers WHERE CustomerId = @0";

    public static readonly string UpdateStatementWithIndexedParams =
        @"UPDATE dbo.Customers SET CompanyName = @1 WHERE CustomerId = @0";

    public static readonly string DeleteStatementWithIndexedParams =
        @"DELETE FROM dbo.Customers WHERE CustomerId = @0";

    public static readonly string TestId = "TEST";
    public static readonly string InsertCompanyName = "Inserted company";
    public static readonly string UpdateCompanyName = "Updated company";

    internal static void EnsureDbSetup()
    {
        using (var cnn = GetOpenConnection())
        {
            var cmd = cnn.CreateCommand();
            cmd.CommandText = @"
                if (OBJECT_ID('Customers') is null)
                begin
                    CREATE TABLE [dbo].[Customers](
                        [CustomerID] [nchar](5) NOT NULL,
                        [CompanyName] [nvarchar](40) NOT NULL,
                        [ContactName] [nvarchar](30) NULL,
                        [ContactTitle] [nvarchar](30) NULL,
                        [Address] [nvarchar](60) NULL,
                        [City] [nvarchar](15) NULL,
                        [Region] [nvarchar](15) NULL,
                        [PostalCode] [nvarchar](10) NULL,
                        [Country] [nvarchar](15) NULL,
                        [Phone] [nvarchar](24) NULL,
                        [Fax] [nvarchar](24) NULL)
                end
                ";
            cmd.Connection = cnn;
            cmd.ExecuteNonQuery();
        }
    }

    public static SqlConnection GetOpenConnection()
    {
        var connection = new SqlConnection(ConnectionString);
        connection.Open();
        return connection;
    }
}

SQL Profiler


Using SQL Profiler we can view the TSQL that is being sent by Dapper and PetaPoco:


[Screenshot: SQL Profiler trace]


I hope you find this article helpful. Happy coding!


Downloads


Blog source code (including source for Dapper and PetaPoco): http://www.stevenhollidge.com/blog-source-code/MicroOrmDemo.zip


Dapper (includes performance tests comparing Data Access solutions):  http://code.google.com/p/dapper-dot-net/


PetaPoco:  http://www.toptensoftware.com/petapoco/

Tuesday, 19 April 2011

Export and Import Xml using TSQL

For this example we’ll be using our old favourite, the Microsoft sample Northwind database.

You can download the Northwind database installer here:

http://www.microsoft.com/downloads/en/details.aspx?familyid=06616212-0356-46a0-8da2-eebc53a68034&displaylang=en

Exporting Xml using TSQL

We’ll be using xp_cmdshell and BCP to export the data, so you’ll need to configure your SQL Server instance to allow you to run the command.  You can do that by running the following script:
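A sketch of the standard sp_configure approach (sysadmin rights required):

-- xp_cmdshell is an advanced option and is disabled by default
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;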



To start with I’ll show the data we plan to export in a standard denormalised format from a SQL query:
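For example, a simple join of customers to their orders (the exact query doesn’t matter; this one is just illustrative):

USE Northwind;

-- One row per order, with the customer details repeated on every row
SELECT Customers.CustomerID, Customers.CompanyName, Orders.OrderID, Orders.OrderDate
FROM dbo.Customers
JOIN dbo.Orders ON Orders.CustomerID = Customers.CustomerID
ORDER BY Customers.CustomerID, Orders.OrderID;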



Which returns us the data in the following format (for now I’ll only show the top of the results):



[Screenshot: query results]



To return the same data as nicely formatted and nested XML, we can run the following TSQL statement:
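Adding FOR XML AUTO, ELEMENTS to the same join does the nesting for us (a sketch):

USE Northwind;

-- AUTO nests Orders elements inside each Customers element;
-- ELEMENTS returns the columns as child elements rather than attributes
SELECT Customers.CustomerID, Customers.CompanyName, Orders.OrderID, Orders.OrderDate
FROM dbo.Customers
JOIN dbo.Orders ON Orders.CustomerID = Customers.CustomerID
ORDER BY Customers.CustomerID, Orders.OrderID
FOR XML AUTO, ELEMENTS;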



Which gives us our Xml representation (again, I’ll only show the top of the results):



[Screenshot: XML results]



To export our data in Xml format we can run the following script:
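A sketch of the export, shelling out to BCP via xp_cmdshell (this assumes a trusted connection to a local default instance and writes to C:\data.xml):

-- Build the BCP command line: -w writes Unicode, -T uses a trusted
-- connection and -S . targets the local default instance
DECLARE @cmd varchar(2000);
SET @cmd = 'bcp "SELECT Customers.CustomerID, Customers.CompanyName, Orders.OrderID, Orders.OrderDate '
         + 'FROM Northwind.dbo.Customers '
         + 'JOIN Northwind.dbo.Orders ON Orders.CustomerID = Customers.CustomerID '
         + 'ORDER BY Customers.CustomerID '
         + 'FOR XML AUTO, ELEMENTS" queryout "C:\data.xml" -w -T -S .';
EXEC xp_cmdshell @cmd;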



Notice in the BCP statement we are using the -w parameter to indicate we want the output in Unicode.  In SQL 2008 R2, to boost performance, we can change this to use the native SQL format, but more on that later.



Note: You’ll need to ensure your SQL Server instance is configured correctly, with the appropriate permissions set to run the xp_cmdshell and BCP statement and access to the filesystem.



You can open our new C:\data.xml file in Notepad or IE and see that we have a nicely formatted XML file:



[Screenshot: data.xml in Internet Explorer]



Importing Xml using TSQL



To import the data back into SQL you can use the following statement:
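One approach is OPENROWSET, reading the Unicode file back in as a single value and then shredding it with the xml type’s nodes() method (a sketch that matches the export above):

-- SINGLE_NCLOB reads the whole Unicode file as one nvarchar(max) value
DECLARE @xml xml;
SELECT @xml = BulkColumn
FROM OPENROWSET(BULK 'C:\data.xml', SINGLE_NCLOB) AS import;

-- Shred the XML back into rows (element names follow the FOR XML AUTO query)
SELECT c.cust.value('(CustomerID)[1]',  'nchar(5)')     AS CustomerID,
       c.cust.value('(CompanyName)[1]', 'nvarchar(40)') AS CompanyName,
       o.ord.value('(OrderID)[1]',      'int')          AS OrderID,
       o.ord.value('(OrderDate)[1]',    'datetime')     AS OrderDate
FROM @xml.nodes('/Customers') AS c(cust)
CROSS APPLY c.cust.nodes('Orders') AS o(ord);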



Which results in:



[Screenshot: import results]



SQL 2008 R2 Native Format Support



For improved performance in SQL 2008 R2 you can alter the scripts above to utilise Native format support.



For export, if you change the BCP statement in our export script above to use -N instead of -w:



You won’t be able to view the data in a text editor, as it’s now in SQL’s native encoding, but you’ll get much improved performance.



[Screenshot: data.xml in native format]



For import, you can change the script to reference the data file type of ‘widenative’:



More information about using native format for import and export can be found here:



http://msdn.microsoft.com/en-us/library/ms191232.aspx



I hope you found this blog post useful. Happy coding!

Sunday, 17 April 2011

Knockout MVVM JavaScript UI framework

From Mix 2011, Steve Sanderson delivers this lightning talk introducing Knockout.js.

Learn how the Knockout library builds on advanced jQuery and JavaScript techniques to make even the most complex data-filled HTML forms a breeze. We’ll see jQuery, jQuery templating, JSON and live data binding applied to the MVVM pattern with Knockout, combined with ASP.NET to produce results that need to be seen to be believed.

Highly recommended; within 5 minutes you’ll understand the power of Knockout.js:

To view or download the video from the Microsoft website, click here:

http://channel9.msdn.com/Events/MIX/MIX11/FRM08

Introduction to Gibraltar for C# Log Analysis

Gibraltar monitors .NET applications to record errors, events, trace statements and performance metrics, then lets you effortlessly analyse the results.


There are three parts to Gibraltar:

Agent

The Agent is a DLL (Gibraltar.Agent.dll) referenced by your .NET application that records the errors, events and metrics of your app.  It’s really simple to add to your project and can be configured automatically via a wizard, manually by editing an XML config file, or programmatically via the Gibraltar API.
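Because the Agent records standard trace statements, plain System.Diagnostics calls are usually all the instrumentation your code needs once the Agent has been configured (a minimal sketch):

using System;
using System.Diagnostics;

internal class Program
{
    private static void Main()
    {
        // Standard trace output is one of the things the Agent records
        Trace.TraceInformation("Application starting");

        try
        {
            throw new InvalidOperationException("Something went wrong");
        }
        catch (Exception ex)
        {
            // Errors logged here show up in the Analyst alongside everything else
            Trace.TraceError("Unhandled error: {0}", ex);
        }

        Trace.TraceInformation("Application stopping");
    }
}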

Hub

The Hub serves as a delivery mechanism for transporting the results from each Agent to the Analyst.

Analyst

The Analyst is a GUI application that lets developers view and analyse results through drilldowns, filtering and charts.

Some examples of using the Analyst can be seen here:

[Screenshot: Analyst drill-down]

It’s very simple to drill down to specific targeted areas using grouping and filtering by categories, namespace/class hierarchies, thread id, severity, etc.

[Screenshot: Analyst source code view]

You can see here that the Analyst shows you the source code that has generated the message.

I highly recommend the following video from Gibraltar Software that gives an excellent overview of Gibraltar.  The video also gives an example of teaming the software up with PostSharp, the popular AOP (Aspect Oriented Programming) framework:

Using PostSharp with Gibraltar

To download trial versions of Gibraltar and PostSharp please visit the official websites:

http://www.gibraltarsoftware.com/

http://www.sharpcrafters.com/postsharp