Entity Framework, Technical

In defence of the RDBMS: Development is not slow. You just need better tooling.

I pride myself on being a real world software developer.

Anything that I publish on this blog is the result of some real world coding. I don’t just play around with small, unrealistic demoware applications. I write real world applications that get used by lots of users.

With experience comes wisdom, and in software part of that wisdom is knowing when to employ a certain technology. There is almost never a right or wrong answer on what technologies you should be using on a project, as there are plenty of factors to consider. They usually include:

  • Is technology x the right tool for this job?
  • Do we have the knowledge to work with and become productive in technology x within the scope of the budget?
  • How mature is technology x?

The above questions will vary in importance depending on where you are currently working. If you are working at an agency, where budgets must be stuck to in order for the company to make a profit, and long term support must be considered, then the last two points carry greater importance.

If you are working in house for a company that has its own product, then the last two points become less important.

Just about the only wrong reason would be something like:

  • Is technology x the latest and greatest thing, getting plenty of buzz on podcasts, blogs and twitter?

This doesn’t mean you should be using it on all projects right now. It just means that technology x is something you ought to look into and assess for yourself (I’ll let you guess where it might sit on the technology hype cycle). But is this project the right place to make that assessment? Perhaps – but deciding that it is will certainly increase your project’s risk.

One of the technologies that we are hearing more and more about is NoSQL databases. I won’t go into a detailed explanation here, but you should be able to get a good background from this Wikipedia article.

Whilst I have no issues with NoSQL databases, I do take issue with one of the arguments against RDBMSs – that development against them is slow. I have now seen several blog posts arguing that developing with a NoSQL database makes your development faster, which would imply that developing against a traditional RDBMS is slow. This isn’t true. You just need to improve your tooling.

Here’s a snippet from the MongoDB “NoSQL Databases Explained” page:

NoSQL databases are built to allow the insertion of data without a predefined schema. That makes it easy to make significant application changes in real-time, without worrying about service interruptions – which means development is faster, code integration is more reliable, and less database administrator time is needed.

Well, I don’t know about you, but I haven’t had to call upon the services of a DBA for a long time – and nearly all of my applications are backed by a SQL Server database. Snippets like the above miss out a key thing – the tooling.

Looking specifically at .NET development, the tooling for database development has advanced massively in the last 6 years or so, largely thanks to the plethora of ORMs that are available. We no longer need to spend time writing stored procedures and data access code. We no longer need to manually put together SQL scripts to handle schema changes (migrations). Those are huge real world development time savers – and they come at a much smaller cost, because at the core is a well understood technology (unless your development team is packed with NoSQL experts).

Let’s look specifically at real world use. Most new applications that I create are built using Entity Framework code first. This gives me object mapping and schema migration with minimal effort.

It also gives me total control over the migration of my application’s database:

  1. I can ask Entity Framework to generate a script of the migrations that need to be run on the database. This can then be run manually against the database, and the application will even warn you at runtime if the schema is missing a migration
  2. I can have my deployment process migrate the database. The Entity Framework team bundle a console app that can be packaged up and called from another process – migrate.exe
  3. I can have my application migrate itself. That’s right – Entity Framework even allows me to run migrations programmatically. It’s not exactly hard either (see the sketch below)
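
To give a flavour of that third option, here’s a minimal sketch. It assumes a Configuration class generated into your Migrations folder by Enable-Migrations:

using System.Data.Entity.Migrations;

// Apply any migrations the database is missing
var migrator = new DbMigrator(new Configuration());
migrator.Update();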

My point is this: whilst migrating a schema may not be something you would need to do in a NoSQL database (although you would need to handle those changes further up your application’s stack), making changes to an RDBMS’s schema just isn’t as costly or painful as is being made out.

Think again before you dismiss good ol’ RDBMS development as slow – because it hasn’t been slow for me for a long time.

Technical, Visual Studio

Using NDepend to clean up code and remove smells

At some point in your development career, you will have had an existing project dumped on you that you have problems understanding and generally navigating. Those difficulties can be the result of undocumented domain reasons, but they can also be caused by code smells – and the code smells will in turn make the domain harder to understand. This, I’m sure, is something nearly every developer has experienced.

When this happens, the project that you are working on contains a large amount of technical debt. Every new developer on the project loses time trying to navigate their way around the confusing and smelly code. The project becomes infamous within your team and nobody wants to work on it. The code becomes unloved with no real owner. You need to repay some of the technical debt.

In the agile world, we should be refactoring and reviewing code bravely and regularly to improve its quality and to reduce the number of code smells. This, however, can be difficult for a number of reasons:

  • Confidence – “can I really change this code without breaking xy and z section of this application?”
  • Reasoning – “Is renaming this method from “xyz” to “Xyz” really the correct thing to do?”

It’s safe to assume that nobody is going to be completely sure of the above two questions in all circumstances. This is why it’s becoming more and more common to use a code analysis tool to help you find any potential code smells, and advise you on how to fix them. A code analysis tool can be that advisor, telling you what you can do to reduce your technical debt. It can also stop you from racking up technical debt in the first place.

In this post, I’ll be exploring NDepend, a powerful static code analysis tool that can highlight any smells in your code and give you some good metrics about it. I’ll be running it against the latest version of NerdDinner, which can be downloaded from here.

You can run through this walk-through as you are reading this post with NDepend. You can find installation instructions here.

NDepend examines compiled code to help you find any potential issues in the generated IL code – so if you are running NDepend outside of Visual Studio, make sure you build your project first.

NDepend 101 – Red / Yellow / Green code

So let’s start with the basics. One of the coolest things about NDepend is the metrics that you can get at so quickly, without really doing much. After downloading and installing NDepend, you will see a new icon in your Visual Studio notification area indicating that NDepend is installed:

ndependnew

Now, we can fire up NDepend simply by clicking on this new icon and selecting the assemblies that we want to analyse:

ndependattach

We want to see some results straight away, so let’s check “Run Analysis Now!”. Go ahead and click OK when you are ready. This will then generate an HTML page with detailed results of the code analysis. The first time you run NDepend you will be presented with a dialogue advising you to view the NDepend Interactive UI Graph. We’ll get to that in a moment – but first let’s just see what NDepend’s default rules thought of NerdDinner:

ndependwarning

Yellow! This means that NerdDinner has actually done OK – we have some warnings, but no critical code rule violations. If we had some serious code issues, this icon would turn red. These rules can be customised and new rules can be added, but we’ll cover that later. So we now have a nice, quick-to-view metric on the current state of our code.

This is a really basic measure, but it lets us know whether our code, in its current state, passes or fails analysis by NDepend. You may be questioning the usefulness of this, but if your team knows that their code must pass analysis by NDepend, a little red / yellow / green icon becomes a useful and quick-to-see signal: are my changes good or bad?

Dependency Graph

The Dependency Graph allows you to see, visually, which libraries are reliant on each other. This is useful if you want to know what will be affected if you change a library or swap it out for something else (you should be programming against interfaces anyway!):

ndependgraph1

By default, the graph also shows you which library has the most lines of code: the bigger the box, the greater the number of lines of code. This sizing can also be changed to represent other properties of a library, such as the number of IL instructions. This lets you easily visualise information about your codebase.

Queries and Rules

Out of the box, NDepend will check all of your code against a set of pre-defined rules which can be customised. Violations of these rules can be treated as a warning, or as a critical error.

So NerdDinner has thrown up a few warnings from NDepend. Let’s have a look at what these potential code smells are, and see how they can be actioned:

ndepend-queries-and-rules


So, within our Code Quality rule group, NerdDinner has thrown up warnings against 3 rules. NDepend’s rules are defined using LINQ queries (CQLinq) that analyse your code. Let’s take a look at the query that finds any methods that are too big:

ndepend-methods-too-big
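
If you’re reading along without the screenshot, the default rule is roughly along these lines (paraphrased from memory – check the rule in your own NDepend install for the exact query and threshold):

// <Name>Methods too big</Name>
warnif count > 0 from m in JustMyCode.Methods where
  m.NbLinesOfCode > 30
select new { m, m.NbLinesOfCode }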


It’s quite self explanatory. We can easily alter this LINQ query if we want to change the rule – e.g. alter the number of lines of code needed to trigger the warning. Looking at the results from the query:

ndepend-warning-results


NDepend has directed us to 2 methods that violate this rule. We can jump to each one by double clicking on it, and start refactoring. It’s worth stating here that a method that is too big potentially violates the single responsibility principle, as it must be doing too much. It needs breaking up.

Using NDepend to enforce future change rules

A stand out feature of NDepend that I haven’t seen anywhere else is its ability to execute rules against changes. That is, you can tell NDepend to look at two different versions of the same dll, and check that only the changes made meet a set of rules. This can be handy in situations where you cannot realistically go back and fix all of the previous warnings and critical violations. Whilst you can’t expect your team to fix the old code smells, you can at least expect them to put good, clean code into the application from now on.

Again, NDepend comes with some really sensible rules for this kind of situation:

ndepend-code-quality-regression


Conclusion

Whilst there are plenty of tools out there to help you write clean, maintainable, non-smelly code, NDepend strikes me as a very powerful and highly customisable one. It also stands out because it can be run within Visual Studio itself or as a standalone executable, which opens up the potential for it to be run on a build server as part of a build process. I certainly have not done NDepend justice in this post, as there is heaps more to it, so I would recommend downloading it and running it against that huge, scary project that you hate making changes to.


Entity Framework, Technical

Running SQL commands with EF Code First

Before ORMs we used to write SQL code.

Yes – real, “bare metal” SQL. We used it for our CRUD operations, and to perform other larger data manipulation tasks. The database server should be the quickest way to find, remove and join data – provided you know what you are doing.

Then we started using ORMs and stopped writing SQL. The advantages are that we reduced our development time, needed fewer developers with a good knowledge of SQL programming, and didn’t have to write lengthy and repetitive SQL statements (anyone who has worked on or built a data warehouse will fully agree).

But with this, we sacrificed control over what SQL was run against our database server, leaving it to the ORM to decide what to run.

Looking specifically at Entity Framework code first, let’s take a look at how you can run into problems with a delete.

So here’s the scenario. I have a task that pulls in data from an external source every hour, and that data needs to be “mirrored” into a table in my application’s database. Let’s call the table BatchImportData.

As I do not own the external data and have absolutely no control over it, I need to do the following to accomplish the task:

  • Delete all of the data in the BatchImportData table
  • Grab the data from the external resource
  • Insert all of the grabbed data into BatchImportData

Using EF code first, I would normally expect to delete all records from the BatchImportData table with the following code:

// ToList() materialises the records first, so we aren't modifying the set while enumerating it
foreach (var batchImportDataItem in context.BatchImportData.ToList())
{
    context.BatchImportData.Remove(batchImportDataItem);
}
context.SaveChanges();

This will work, but it will be slow to execute. At the very least, EF will issue a separate DELETE statement for every single record that exists in BatchImportData.

If we were writing bare metal SQL, we would write either a single delete statement, or a single truncate statement:

DELETE FROM BatchImportData

--OR

TRUNCATE TABLE BatchImportData

(As an aside, TRUNCATE TABLE is usually the quicker of the two, as it is minimally logged – but it also resets any identity seed and cannot be used on a table that is referenced by a foreign key.)

We can still do this through EF Code First simply by opening up our DbContext a bit more. Currently, our DbContext will look something like this:

public class DbContext : System.Data.Entity.DbContext, IDbContext
{
    public IDbSet<BatchImportData> BatchImportData { get; set; }
}

Let’s add a public method in our DbContext that exposes System.Data.Entity.DbContext.Database.ExecuteSqlCommand:

public class DbContext : System.Data.Entity.DbContext, IDbContext
{
    public int ExecuteSqlCommand(string sql)
    {
        return base.Database.ExecuteSqlCommand(sql);
    }

    public IDbSet<BatchImportData> BatchImportData { get; set; }
}

This method will take in a SQL statement and will run it against the database.

You can then call the new ExecuteSqlCommand method that you have just added:

   Db.ExecuteSqlCommand("TRUNCATE TABLE BatchImportData");

We now have a much quicker way of removing all records from a table.

Use with caution!

Do not use this if you are going to build up a SQL statement based on user input by concatenating strings – you will make yourself susceptible to a SQL injection attack.
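
If you do need to get a value into the statement, the underlying ExecuteSqlCommand does accept parameters, so you could widen the wrapper along these lines (a sketch – SourceId is a hypothetical column, purely for illustration):

public int ExecuteSqlCommand(string sql, params object[] parameters)
{
    // Values are sent to SQL Server as parameters – never concatenated into the string
    return base.Database.ExecuteSqlCommand(sql, parameters);
}

// Usage:
Db.ExecuteSqlCommand("DELETE FROM BatchImportData WHERE SourceId = {0}", sourceId);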

This SQL command is merely a string – it is not strongly typed. If we rename our BatchImportData entity and forget to update the SQL command to match, we will get a runtime error.

It also opens you up to some potentially serious data loss mistakes – the classic being a missing WHERE clause.

Azure, MVC, MVC 3, MVC4, Technical

Redirecting legacy pages in asp.net

Picture this situation.

An old (legacy) application has landed on your project pile. It is largely built in PHP, and you intend to re-write it in ASP.NET MVC.

You will therefore need some way of informing any parties that may be trying to access the old URLs ending in .php that the resource they are looking for has moved permanently. You may also wish to do this for SEO reasons.

This is something that cannot be achieved easily through routing; by default, IIS will not pass requests for resources ending in .php to your application, so your routing will never even see them.

The nicest solution I have found to this issue is to set up a list of redirects within the system.webServer section of your web.config file. The listing below will send an HTTP response status of 301 (Moved Permanently) for any requests for index.php or prices.php:

<system.webServer>
  <httpRedirect enabled="true" httpResponseStatus="Permanent" exactDestination="true">
    <add wildcard="/prices.php" destination="/prices"/>
    <add wildcard="/index.php" destination="/"/>
  </httpRedirect>
</system.webServer>

Index.php will now redirect to /, and prices.php will now redirect to /prices.

This code is currently running in the wild on Azure.

If you are unsure if you need a Permanent redirect or not, have a read of this article from Google.

Entity Framework, Technical

Creating a composite primary key in Entity Framework 4.1

There are two main ways of achieving this. Let’s look at an object – Brochure: { ProductId, Year, Month, ProductName}. We want:

  • ProductId
  • Year
  • Month

to make up the primary key.

Method 1 – Data annotations

In your entity class, simply decorate any properties that you want to make up your key with the [Key] attribute:

public class Brochure
{
    [Key, Column(Order = 0)]
    public int ProductId { get; set; }

    [Key, Column(Order = 1)]
    public int Year { get; set; }

    [Key, Column(Order = 2)]
    public int Month { get; set; }

    [Required]
    public string ProductName { get; set; }
}
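
As an aside, if you’d rather keep attributes out of your entity classes, the same key can be declared with the Fluent API by overriding OnModelCreating in your context – a quick sketch:

protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    // The anonymous type declares the same three-column composite key
    modelBuilder.Entity<Brochure>()
        .HasKey(b => new { b.ProductId, b.Year, b.Month });
}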

Method 2 – DbMigration class

NOTE: You shouldn’t need to use this method if you are using full Entity Framework code first. However, some projects only use Entity Framework to handle migrations – so this might be of use to you:


public partial class BrochureTable : DbMigration
{
    public override void Up()
    {
        CreateTable("Brochures", c => new
        {
            ProductId = c.Int(nullable: false),
            Year = c.Int(nullable: false),
            Month = c.Int(nullable: false),
            ProductName = c.String(maxLength: 60)
        })
        .PrimaryKey(bu => new { bu.ProductId, bu.Year, bu.Month });
    }
}

Enjoy!

Entity Framework, Technical

Running Entity Framework code first migrations programmatically

Entity Framework code first migrations can easily be run programmatically. You can target a specific migration, or you can simply update to the latest one.

To roll back all migrations (this calls the “Down” method on each migration):

var configuration = new Configuration();
var migrator = new DbMigrator(configuration);

// Roll everything back – DbMigrator.InitialDatabase ("0") targets the empty database
migrator.Update(DbMigrator.InitialDatabase);

To roll back or update to a specific migration:

var configuration = new Configuration();
var migrator = new DbMigrator(configuration);
//Update / rollback to "MigrationName"
migrator.Update("MigrationName");

To update to the latest migration:

var configuration = new Configuration();
var migrator = new DbMigrator(configuration);

// Update database to latest migration
migrator.Update();
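
And if all you want is for the database to be brought up to date whenever the application runs, EF 4.3+ also ships a database initializer that does this for you – a minimal sketch, assuming your context class is called MyDbContext:

using System.Data.Entity;

// Runs any pending migrations the first time MyDbContext is used
Database.SetInitializer(new MigrateDatabaseToLatestVersion<MyDbContext, Configuration>());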

Security, Technical

Making Microsoft Security Essentials behave like antivirus software

I have been running Microsoft’s free antivirus software, Security Essentials, on all of my home machines since it was first released.

On three separate occasions, I’ve discovered Trojans running on machines that are supposed to be protected by this antivirus software.

This freaked me out the first time, concerned me the second time and made me rage quit Security Essentials the third time. I’m now running Bit Defender at home.

I looked into these problems as I could not be the only one facing these issues. I was right – I found several forum posts by people complaining of the same problems.

Upon looking into the issue further, I discovered that there isn’t actually anything wrong with the detection on Security Essentials. In fact, it ranks quite nicely amongst alternative free antivirus solutions (Avast, AVG, Avira).

The problem appears to be its default settings. By default, Security Essentials is set up to run at 2 am on a Sunday, and will only look for an update to its virus definitions just before it runs. If, like most home users’ machines, your PC is off at 2 am on a Sunday, these two critical actions will never happen. No update. No scan. This leaves your PC with very little protection.

If you’re going to use Security Essentials, you need to tweak the settings to make it more protective of your PC. Below are my recommended settings. Fire up Security Essentials and navigate to “Settings”.

1. Scheduled Scan

I’d recommend a daily “Quick Scan” at a time that you know your PC will be on. If you’re worried about the slowdown, simply limit the CPU usage. And remember, the slowdown and downtime that you get as the result of a virus will be a lot worse than any slowdown you might get as a side effect of an antivirus scan:


SecurityEssentialsSettingsScheduledScan

2. Default Actions

If my antivirus thinks it’s found a severe or high alert, I want it removed:

SecurityEssentialsSettingsDefaultActions

3. Real-time protection

This should be on. If it isn’t, turn it on.

4. Excluded files and locations

Be sensible here. Add any directories and folders that you work on regularly and that are unlikely to get infected. For example, as a developer, I know that my source code is unlikely to be affected by a virus. As I will be writing changes to these files regularly, I also do not want any slowdown as a side effect of the antivirus scanning every edit:

SecurityEssentialsSettingsExcludedLocations

5. Excluded file types

Again, you want to be sensible here, and ideally exclude as few file types as possible. The default exclusions of .ini and .log files should be sufficient.

6. Excluded processes

If you use any heavy applications for work, it is worth adding them to this list. As a developer, I tend to spend a lot of time in Visual Studio. I know this process is a safe one, as I installed it myself and it came from a vendor that I trust:

SecurityEssentialsSettingsExcludedProcesses

7. Advanced

The only change I’d suggest here is setting Security Essentials to scan your removable drives:

SecurityEssentialsSettingsAdvanced

Security Essentials should now be of much greater protective value to you. If you don’t think this will protect you enough, consider purchasing a commercial antivirus solution.

MVC 3, MVC4, Razor, Technical

Using MVC4 bundling and minification in an MVC3 project

When you push any site to production, you should at least do some basic front end optimisation. Running through a number of optimisation steps will ultimately make your website load quicker and feel more responsive to the end user.

If this is the first you have heard about front end optimisation, I highly recommend reading “High Performance Web Sites” by Steve Souders.

One of the best ways to make your site load quicker is to make it smaller: reduce the amount of data that needs to go from your server to the client machine. And one of the best ways to make massive gains here is to bundle and minify the CSS and JavaScript that your application uses.

MVC4 has made this incredibly easy with built-in bundling and minification. However, you do not need to upgrade your entire MVC project to take advantage of this fantastic new set of features.

Firstly, open up the project that you want to optimise. I’ll be demonstrating this with a new empty MVC3 web application. Currently, the head of my _Layout.cshtml file looks like this:


<head>
 <title>@ViewBag.Title</title>
 <link href="@Url.Content("~/Content/Site.css")" rel="stylesheet" type="text/css" />
 <script src="@Url.Content("~/Scripts/jquery-1.7.1.js")" type="text/javascript"></script>
 <script src="@Url.Content("~/Scripts/jquery.validate.js")" type="text/javascript"></script>
 <script src="@Url.Content("~/Scripts/jquery.validate.unobtrusive.js")" type="text/javascript"></script>
 <script src="@Url.Content("~/Scripts/modernizr-2.5.3.js")" type="text/javascript"></script>
</head>

So let’s run the page and see what’s going on under the hood:

IndexPage

So at the moment, my page weighs in at a massive 355.89 kilobytes – and all it does is display some text: “Index”. That’s way too big – and we can see that the biggest and slowest loading parts of the site are the JavaScript libraries that are being pulled onto every page. (It’s worth noting at this point that you could use the already-minified versions of the files. For the sake of this demo, I’ll be using the full fat versions.)

So let’s get bundling and minifying.

Open the Package Manager Console, and enter the following command to pull down the Microsoft ASP.NET Web Optimization Framework:

  •  Install-Package Microsoft.AspNet.Web.Optimization -Pre

Once it has been added to your project, you can then begin configuring your bundling and minification. Open your Global.asax.cs file and add a reference to System.Web.Optimization:


using System.Web.Optimization;

Then, go ahead and create a new static method, RegisterBundles:


public static void RegisterBundles(BundleCollection bundles)
{

}

Now, let’s create some bundles. I’m going to put all of my JavaScript into one minified bundle, and my CSS into another:

public static void RegisterBundles(BundleCollection bundles)
{
    //CSS
    var styles = new StyleBundle("~/Content/bundledcss").Include("~/Content/site.css");

    //JS - note that these files live in ~/Scripts, as referenced in the original head
    var js = new ScriptBundle("~/Scripts/bundledjs").Include(
        "~/Scripts/jquery-1.7.1.js",
        "~/Scripts/jquery.validate.js",
        "~/Scripts/jquery.validate.unobtrusive.js",
        "~/Scripts/modernizr-2.5.3.js");

    bundles.Add(styles);
    bundles.Add(js);

    // Force bundling and minification even when compilation debug="true"
    BundleTable.EnableOptimizations = true;
}

There are three types of bundle that you can create:

  • Bundle – a non-minified bundle of a collection of files
  • StyleBundle – a minified bundle of a collection of CSS files
  • ScriptBundle – a minified bundle of a collection of JavaScript files

Now, let’s wire up a call to our new RegisterBundles method. In Global.asax.cs, locate your Application_Start method and add the following line:

RegisterBundles(BundleTable.Bundles);

Now, when your application starts up, the bundles will be created. All that’s left is to tell the views to load the bundled files instead of the raw, unbundled and unminified CSS and JS files.

In your _Layout.cshtml file, or whichever file has your stylesheets and JavaScript files referenced, swap out your raw file references. Firstly, add a reference at the top of your view to System.Web.Optimization:

@using System.Web.Optimization

Now, let’s swap out the old references to our full fat files:

<link rel="stylesheet" type="text/css" href="@BundleTable.Bundles.ResolveBundleUrl("~/Content/bundledcss")" />
<script src="@BundleTable.Bundles.ResolveBundleUrl("~/Scripts/bundledjs")"></script>
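
As an aside, depending on which build of the optimisation package you end up with, you may also find the Styles.Render and Scripts.Render helpers available, which resolve the same bundle URLs:

@Styles.Render("~/Content/bundledcss")
@Scripts.Render("~/Scripts/bundledjs")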

And that’s us done. Let’s test it out. Remember to build first:

IndexPageOptimised

And we’ve made a massive improvement. We have:

  • Reduced the number of requests (from 6 to 3, as we’ve bundled multiple files into one)
  • Reduced the file sizes

This won’t take long to bring into an existing site and is definitely worth the benefits.

Edit: Of course, you should also move your JavaScript to the foot of the HTML document for even more speed improvements.

MVC 3, Technical, Unit Testing

Mocking HttpContext (and setting its session values)

Thanks to this Stack Overflow answer that pushed me in the right direction, I was able to mock HttpContext and set values that it encompasses.

Firstly, you will need a helper somewhere in your test project that will return you a mock HttpContext:

public static class MockHelpers
{
  public static HttpContext FakeHttpContext()
  {
    var httpRequest = new HttpRequest("", "http://localhost/", "");
    var stringWriter = new StringWriter();
    var httpResponse = new HttpResponse(stringWriter);
    var httpContext = new HttpContext(httpRequest, httpResponse);

    // A session container backed by an in-process session state item collection
    var sessionContainer = new HttpSessionStateContainer("id", new SessionStateItemCollection(), new HttpStaticObjectsCollection(), 10, true, HttpCookieMode.AutoDetect, SessionStateMode.InProc, false);

    SessionStateUtility.AddHttpSessionStateToContext(httpContext, sessionContainer);

    return httpContext;
  }
}

You will then need to add a reference to System.Web in your test project. Once done, you will be able to set HttpContext.Current and any HttpContext specific values, such as session variables. In the example below, I am setting up the HttpContext in the SetUp method of a unit test:

[SetUp]
public void SetUp()
{
	HttpContext.Current = MockHelpers.FakeHttpContext();
	HttpContext.Current.Session["SomeSessionVariable"] = 123;
}
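
A test can then read the session back as normal (a trivial illustration, using the value seeded in SetUp above):

[Test]
public void Can_read_mocked_session_values()
{
    var value = (int)HttpContext.Current.Session["SomeSessionVariable"];

    Assert.AreEqual(123, value);
}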

A heavier alternative to the above would be to use a factory to create and get at your session. I opted for the solution above as it meant not changing my application code to fit in with my unit tests.

Entity Framework, Technical

Seed data from SQL scripts using Entity Framework Migrations (EF 4.3 +)

Normally you would add seed data using native C#.

You can also execute arbitrary SQL statements. To do so, in your Seed method (which you can override in the Configuration class inside your Migrations folder), simply read the contents of any SQL files you want to execute, and tell the context to run them:

internal sealed class Configuration : DbMigrationsConfiguration<MyDbContext>
{
    protected override void Seed(MyDbContext context)
    {
        // Resolve the Migrations folder relative to the application's base directory
        var baseDir = AppDomain.CurrentDomain.BaseDirectory.Replace("\\bin", string.Empty) + "\\Migrations";

        context.Database.ExecuteSqlCommand(File.ReadAllText(baseDir + "\\DataClear.sql"));
        context.Database.ExecuteSqlCommand(File.ReadAllText(baseDir + "\\Table1Inserts.sql"));
        context.Database.ExecuteSqlCommand(File.ReadAllText(baseDir + "\\Table2Inserts.sql"));
    }
}

The above code will execute DataClear.sql, Table1Inserts.sql and Table2Inserts.sql, which are all in the root of my migrations folder.

Don’t forget that you can generate your insert statements using management studio, or by using the Static Data Generator tool.