Insight.Database with Unity integration

One of my favourite development packages to use over the last year has been a micro-ORM called Insight.Database written by Jon Wagner. Version 2.1 of this excellent package comes with automatic interface implementations so you don’t even have to write DAL classes anymore.

I hadn’t been able to play with this feature until last week. One of the roadblocks was that I use Unity as my IoC container for dependency injection, which does not natively support injecting these types of instances. What I need is to be able to configure a Unity container to return an auto-implemented interface in a constructor. I have created Unity extensions before for injecting ConnectionStringSettings, AppSettings and RealProxy implementations, so creating one for Insight.Database should be simple. The RealProxy Unity extension is very similar to what I need as the logic is just about the same.
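To make the goal concrete, this is the kind of constructor I want Unity to be able to satisfy. IVerificationStore and AccountManager here are hypothetical names for illustration; the interface is one that Insight.Database would auto-implement rather than something we write a class for.

```csharp
using System;

// Hypothetical DAL interface that Insight.Database would auto-implement
// against a database connection; no manual implementation is written.
public interface IVerificationStore
{
}

// Hypothetical business class that should receive the auto-implemented
// interface through constructor injection.
public class AccountManager
{
    private readonly IVerificationStore _verificationStore;

    public AccountManager(IVerificationStore verificationStore)
    {
        if (verificationStore == null)
        {
            throw new ArgumentNullException("verificationStore");
        }

        _verificationStore = verificationStore;
    }
}
```

The rest of this post is about teaching Unity how to build that IVerificationStore argument from configuration.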

I want to support Unity configuration, so we need to start with a ParameterValueElement class.

namespace MyApplication.Server.Unity
{
    using System;
    using System.Configuration;
    using Microsoft.Practices.Unity;
    using Microsoft.Practices.Unity.Configuration;
    using Seterlund.CodeGuard;

    public class InsightOrmParameterValueElement : ParameterValueElement
    {
        public const string AppSettingKeyAttributeName = "appSettingKey";

        public const string ElementName = "insightOrm";

        public const string IsReliableAttributeName = "isReliable";

        public override InjectionParameterValue GetInjectionParameterValue(
            IUnityContainer container, Type parameterType)
        {
            Guard.That(container, "container").IsNotNull();
            Guard.That(parameterType, "parameterType").IsNotNull();

            var connectionSettings = GetConnectionSettings();

            return new InsightOrmParameterValue(parameterType, connectionSettings, IsReliable);
        }

        protected virtual ConnectionStringSettings GetConnectionSettings()
        {
            // Use the existing azure settings parameter value element to obtain the configuration value
            var configElement = new AzureSettingsParameterValueElement
            {
                AppSettingKey = AppSettingKey
            };

            var connectionSettings =
                configElement.CreateValue(typeof(ConnectionStringSettings)) as ConnectionStringSettings;

            return connectionSettings;
        }

        [ConfigurationProperty(AppSettingKeyAttributeName, IsRequired = true)]
        public string AppSettingKey
        {
            get
            {
                return (string)base[AppSettingKeyAttributeName];
            }

            set
            {
                base[AppSettingKeyAttributeName] = value;
            }
        }

        [ConfigurationProperty(IsReliableAttributeName, IsRequired = false, DefaultValue = false)]
        public bool IsReliable
        {
            get
            {
                return (bool)base[IsReliableAttributeName];
            }

            set
            {
                base[IsReliableAttributeName] = value;
            }
        }
    }
}

This class leverages a variant of the aforementioned ConnectionStringSettings Unity extension which reads database connection information from Azure configuration. You can easily change this part of the class to read the connection string from wherever is appropriate in your system.

Next we need an InjectionParameterValue class that will actually create the instance when the type is resolved by Unity.

namespace MyApplication.Server.Unity
{
    using System;
    using System.Configuration;
    using System.Data;
    using System.Data.Common;
    using System.Reflection;
    using Insight.Database;
    using Insight.Database.Reliable;
    using Microsoft.Practices.ObjectBuilder2;
    using Microsoft.Practices.Unity;
    using Microsoft.Practices.Unity.ObjectBuilder;
    using Seterlund.CodeGuard;

    public class InsightOrmParameterValue : TypedInjectionValue
    {
        private readonly ConnectionStringSettings _connectionSettings;

        private readonly bool _isReliable;

        public InsightOrmParameterValue(
            Type parameterType, ConnectionStringSettings connectionSettings, bool isReliable) : base(parameterType)
        {
            Guard.That(parameterType, "parameterType").IsNotNull();
            Guard.That(connectionSettings, "connectionSettings").IsNotNull();

            _connectionSettings = connectionSettings;
            _isReliable = isReliable;
        }

        public override IDependencyResolverPolicy GetResolverPolicy(Type typeToBuild)
        {
            var instance = ResolveInstance(ParameterType);

            return new LiteralValueDependencyResolverPolicy(instance);
        }

        private object ResolveInstance(Type typeToBuild)
        {
            // Return the connection.As<ParameterType> value
            var parameterTypes = new[]
            {
                typeof(IDbConnection)
            };
            var genericMethod = typeof(DBConnectionExtensions).GetMethod(
                "As", BindingFlags.Public | BindingFlags.Static, null, parameterTypes, null);
            var method = genericMethod.MakeGenericMethod(typeToBuild);

            DbConnection connection;

            if (_isReliable)
            {
                connection = _connectionSettings.ReliableConnection();
            }
            else
            {
                connection = _connectionSettings.Connection();
            }

            var parameters = new object[]
            {
                connection
            };

            var dalInstance = method.Invoke(null, parameters);

            return dalInstance;
        }
    }
}

This class will either create a connection or a reliable connection using the extension methods available in Insight.Database. It then gets a reflected reference to the As<T> extension method that converts that connection into an auto-implemented interface. This is the instance that gets returned.
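The reflection part of ResolveInstance is the general pattern for closing a generic static method over a runtime type. As a standalone sketch of just that pattern, StandInExtensions below is a stand-in for Insight.Database's extension class, not its real API:

```csharp
using System;
using System.Reflection;

// Stand-in for a static class exposing a generic method, analogous to
// the As<T>(IDbConnection) extension method in Insight.Database.
public static class StandInExtensions
{
    public static T Create<T>(string source) where T : new()
    {
        return new T();
    }
}

public class Widget
{
}

public static class GenericMethodDemo
{
    public static object CloseAndInvoke(Type typeToBuild)
    {
        // Find the open generic method by name and parameter list.
        var parameterTypes = new[] { typeof(string) };
        var openMethod = typeof(StandInExtensions).GetMethod(
            "Create", BindingFlags.Public | BindingFlags.Static, null, parameterTypes, null);

        // Close it over the runtime type, then invoke the static method.
        var closedMethod = openMethod.MakeGenericMethod(typeToBuild);

        return closedMethod.Invoke(null, new object[] { "ignored" });
    }
}
```

ResolveInstance does exactly this, except the open method is As, the parameter is the IDbConnection, and the closed type parameter is the interface Unity is resolving.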

Unity needs to know about the custom extension so that it can support it in configuration. The element class needs to be registered with Unity via a SectionExtension.

namespace MyApplication.Server.Unity
{
    using Microsoft.Practices.Unity.Configuration;

    public class SectionExtensionInitiator : SectionExtension
    {
        public override void AddExtensions(SectionExtensionContext context)
        {
            if (context == null)
            {
                return;
            }

            context.AddElement<AzureSettingsParameterValueElement>(AzureSettingsParameterValueElement.ElementName);
            context.AddElement<InsightOrmParameterValueElement>(InsightOrmParameterValueElement.ElementName);
        }
    }
}

Lastly the configuration in Unity needs a pointer to the SectionExtension.

<?xml version="1.0"?>
<unity>

  <sectionExtension type="MyApplication.Server.Unity.SectionExtensionInitiator, MyApplication.Server.Unity" />

  <!-- Unity configuration here -->

</unity>

The only thing left to do is use the Unity configuration to inject an auto-implemented interface instance.

<register type="MyApplication.Server.BusinessContracts.IAccountManager, MyApplication.Server.BusinessContracts"
          mapTo="MyApplication.Server.Business.AccountManager, MyApplication.Server.Business">
  <constructor>
    <param name="store">
      <dependency />
    </param>
    <param name="verificationStore">
      <insightOrm appSettingKey="MyDatabaseConnectionConfigurationKey" isReliable="true" />
    </param>
  </constructor>
</register>

And we are done.

Custom MVC4 bundling for timezoneJS tz data

I am using the timezoneJS library to provide time zone support for JavaScript running in an MVC4 website that is hosted in Azure. The library is bundled with 38 zone files under a tz directory. My understanding is that the library reads these zone files in order to correctly determine time zone offsets for combinations of geographical locations and dates.

The library downloads zone files from the web server as it requires them. The big thing to note about the zone files is that they are heavily commented, using a # character as the comment marker. The 38 zone files in the package amount to 673kb of data, with the file “northamerica” being the largest at 137kb. This drops down to 232kb and 36kb respectively if comments and blank lines are stripped. That’s a lot of unnecessary bandwidth being consumed. MVC4 does not understand these files so none of the out-of-the-box bundling strips the comments. The bundling support in MVC4 (via the Microsoft.Web.Optimization package) will however allow us to strip these files down to the bare data with a custom IBundleBuilder (my third custom bundler – see here and here for the others).

Current Implementation

For background, this is my current implementation. The web project structure looks like this.

[image: web project structure]

The project was already bundling the zone files using the following logic.

private static void BundleTimeZoneData(BundleCollection bundles, HttpServerUtility server)
{
    var directory = server.MapPath("~/Scripts/tz");

    if (Directory.Exists(directory) == false)
    {
        var message = string.Format(
            CultureInfo.InvariantCulture, "The directory '{0}' does not exist.", directory);

        throw new DirectoryNotFoundException(message);
    }

    var files = Directory.GetFiles(directory);

    foreach (var file in files)
    {
        var fileName = Path.GetFileName(file);

        bundles.Add(new Bundle("~/script/tz/" + fileName).Include("~/Scripts/tz/" + fileName));
    }
}

The timezoneJS package is configured so that it correctly references the bundle paths.

timezoneJS.timezone.zoneFileBasePath = "/script/tz";
timezoneJS.timezone.defaultZoneFile = [];
timezoneJS.timezone.init({ async: false });

Custom Bundle Builder

Now comes the part where we strip the unnecessary comments out of the zone files. The TimeZoneBundleBuilder class simply strips out blank lines, lines that start with a comment, and the trailing parts of lines that end in a comment.

public class TimeZoneBundleBuilder : IBundleBuilder
{
    private readonly IBundleBuilder _builder;

    public TimeZoneBundleBuilder() : this(new DefaultBundleBuilder())
    {
    }

    public TimeZoneBundleBuilder(IBundleBuilder builder)
    {
        Guard.That(() => builder).IsNotNull();

        _builder = builder;
    }

    public string BuildBundleContent(Bundle bundle, BundleContext context, IEnumerable<BundleFile> files)
    {
        var contents = _builder.BuildBundleContent(bundle, context, files);

        // The compression of the data files is down to ~30% of the original size
        var builder = new StringBuilder(contents.Length / 3);
        var lines = contents.Split(
            new[]
            {
                Environment.NewLine
            }, 
            StringSplitOptions.RemoveEmptyEntries);

        for (var index = 0; index < lines.Length; index++)
        {
            var line = lines[index];

            if (string.IsNullOrWhiteSpace(line))
            {
                continue;
            }

            if (line.Trim().StartsWith("#", StringComparison.OrdinalIgnoreCase))
            {
                continue;
            }

            var hashIndex = line.IndexOf("#", StringComparison.OrdinalIgnoreCase);

            if (hashIndex == -1)
            {
                builder.AppendLine(line);
            }
            else
            {
                var partialLine = line.Substring(0, hashIndex);

                builder.AppendLine(partialLine);
            }
        }

        return builder.ToString();
    }
}

This is then hooked up in the bundle configuration for the tz files.

bundles.Add(
    new Bundle("~/script/tz/" + fileName)
    {
        Builder = new TimeZoneBundleBuilder()
    }.Include("~/Scripts/tz/" + fileName));

Fiddler confirms that the bundler is stripping the comments. The good news here is that gzip compression also comes into play. Now the gzip compressed “northamerica” file is down from 58kb to 9kb over the wire.
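To see the effect in isolation, here is the same stripping logic as a plain function applied to a fabricated zone-file fragment (the rule content below is made up for illustration, not real tz data):

```csharp
using System;
using System.Text;

public static class ZoneFileStripper
{
    // Applies the same rules as TimeZoneBundleBuilder: drop blank lines,
    // drop comment-only lines, and truncate trailing comments.
    public static string Strip(string contents)
    {
        var builder = new StringBuilder(contents.Length / 3);
        var lines = contents.Split(
            new[] { Environment.NewLine }, StringSplitOptions.RemoveEmptyEntries);

        foreach (var line in lines)
        {
            if (string.IsNullOrWhiteSpace(line))
            {
                continue;
            }

            if (line.Trim().StartsWith("#", StringComparison.OrdinalIgnoreCase))
            {
                continue;
            }

            var hashIndex = line.IndexOf("#", StringComparison.OrdinalIgnoreCase);

            if (hashIndex == -1)
            {
                builder.AppendLine(line);
            }
            else
            {
                builder.AppendLine(line.Substring(0, hashIndex));
            }
        }

        return builder.ToString();
    }
}
```

Feeding a comment-heavy fragment through Strip leaves only the data lines, which is exactly what ends up on the wire.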

[image: Fiddler capture of the bundled tz responses]

One of the key points to take away from this is that you need to know what your application is serving. That includes the output you have written, but also the external packages you have included in your system.

Preventing passive federation for Web Api under an MVC4 website

I have an ASP.Net MVC4 website that is running passive federation to Azure ACS. This works great for standard http requests from a browser. I have now added some web api services to the same solution but hit issues with acceptance testing.

I have secured my api controller using the following.

[Authorize(Roles = Role.Administrator)]
public class AdminReportController : ApiController
{
}

This achieves the objective but the acceptance test that verifies the security of the controller fails. It fails because it gets a 302 response for the unauthenticated call rather than the expected 401. If the http client running the REST request follows the 302, it will ultimately end up with a 200 response from the ACS authentication form. The security objective is still achieved because the anonymous request was denied, but the status code and body content returned to the client do not reflect this.

The only way to get around this seems to be to hijack the WSFederationAuthenticationModule to tell it not to run passive redirection if the request is for the web api.

public class WebApiSafeFederationAuthenticationModule : WSFederationAuthenticationModule
{
    protected override void OnAuthorizationFailed(AuthorizationFailedEventArgs e)
    {
        Guard.That(() => e).IsNotNull();

        base.OnAuthorizationFailed(e);

        if (e.RedirectToIdentityProvider == false)
        {
            return;
        }

        string requestedUrl = HttpContext.Current.Request.Url.ToString();

        if (requestedUrl.IndexOf("/api/", StringComparison.OrdinalIgnoreCase) > -1)
        {
            // We don't want web api requests to redirect to the STS
            e.RedirectToIdentityProvider = false;

            return;
        }
    }
}

This now allows Web Api to run its default pipeline and return a 401 for unauthorised requests.
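For the custom module to take effect, it needs to replace the standard module registration in web.config. A sketch of that registration, where the namespace and assembly names here are assumptions for this example:

```xml
<system.webServer>
  <modules>
    <!-- Swap the standard WIF module for the web api safe version. -->
    <remove name="WSFederationAuthenticationModule" />
    <add name="WSFederationAuthenticationModule"
         type="MyApplication.Server.Web.WebApiSafeFederationAuthenticationModule, MyApplication.Server.Web"
         preCondition="managedHandler" />
  </modules>
</system.webServer>
```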

I put a shout out on Twitter regarding this issue to which Brock Allen quickly confirmed my solution.

Comments now on Disqus

I’ve suspected for years that the comment support on my blog (BlogEngine.Net) has been faulty. Unfortunately the comments have always worked for me whenever I tested them, both as an anonymous user and as an authenticated user. After recent users mentioned that they had commented but the comments were never saved, I have finally migrated comment support over to Disqus.

Hopefully the conversation on blog entries will start up again.

NewRelic Azure NuGet package update pain again

I previously posted about issues I found when updating to the latest Azure NuGet package published by NewRelic. Unfortunately the install PowerShell script for the latest package now has more issues than the previous version. Here are the issues I found and how to fix them.

  1. newrelic.cmd not updated

    I believe the issue here (unconfirmed) is that the file is not updated because it has been changed as part of the previous package installation. The fix is to uninstall the previous package first and then manually delete the newrelic.cmd file. The uninstall won’t remove the file for the same reason.
  2. The license key is not written to newrelic.cmd

    This is a bug in the newrelic.cmd file published with the package. The install script expects the file to contain the line SET LICENSE_KEY=REPLACE_WITH_LICENSE_KEY so that the installer can substitute the value. Unfortunately the cmd file in the package has an actual license key rather than the expected placeholder. This means someone else’s license key will be used rather than your own.
  3. As in my previous experience, ServiceDefinition.csdef was not found. You need to manually update this file to contain the following XML (slightly different from the previous version).
<Startup>
  <Task commandLine="newrelic.cmd" executionContext="elevated" taskType="simple">
    <Environment>
      <Variable name="EMULATED">
        <RoleInstanceValue xpath="/RoleEnvironment/Deployment/@emulated" />
      </Variable>
      <Variable name="IsWorkerRole" value="false" />
    </Environment>
  </Task>
</Startup>

Note that this XML is for a WebRole only. The WorkerRole XML has additional elements and attributes that you will need to add according to the PowerShell script.

Supporting XML transformation on multiple configuration files

Visual Studio has a great feature for web.config files where XML transformations can be done based on the current configuration. This is typically actioned when the web application is published. Unfortunately the MSBuild scripts only cater for web.config. This is a problem when you start to break up your configuration into multiple files and link them back using the configSource attribute.

<system.diagnostics configSource="system.diagnostics.config" />
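For reference, each split-out file pairs with a per-configuration transform using the standard xdt syntax, named for the build configuration. A hypothetical system.diagnostics.Release.config (the listener name is made up for illustration) might look like this:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Hypothetical Release transform for system.diagnostics.config -->
<system.diagnostics xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <trace>
    <listeners>
      <!-- Strip the verbose listener from Release builds -->
      <add name="consoleListener" xdt:Transform="Remove" xdt:Locator="Match(name)" />
    </listeners>
  </trace>
</system.diagnostics>
```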

One of the great things about MSBuild is that you can change it if you don’t like it. The solution to this issue is to open the proj file and add the following at the end of the file.

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

  <!-- Existing proj content is here -->

  <PropertyGroup>
    <TransformWebConfigCoreDependsOn>
      $(TransformWebConfigCoreDependsOn);
      TransformAdditionalConfigurationFiles;
    </TransformWebConfigCoreDependsOn>
  </PropertyGroup>
  <UsingTask TaskName="GetTransformFiles" TaskFactory="CodeTaskFactory" AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll">
    <ParameterGroup>
      <ProjectRoot ParameterType="System.String" Required="true" />
      <BuildConfiguration ParameterType="System.String" Required="true" />
      <BaseNames ParameterType="System.String[]" Output="true" />
    </ParameterGroup>
    <Task>
      <Reference Include="System.IO" />
      <Using Namespace="System.IO" />
      <Code Type="Fragment" Language="cs"><![CDATA[
 
    var configSuffix = "." + BuildConfiguration + ".config";
    var matchingConfigurations = Directory.GetFiles(ProjectRoot, "*" + configSuffix);
    var files = new List<String>();
        
    foreach (String item in matchingConfigurations)
    {
        var filename = Path.GetFileName(item);
        var suffixIndex = filename.IndexOf(configSuffix);
        var prefix = filename.Substring(0, suffixIndex);
            
        if (prefix.Equals("Web", StringComparison.OrdinalIgnoreCase))
        {
            continue;
        }
        
        var targetFile = prefix + ".config";
        var targetPath = Path.Combine(ProjectRoot, targetFile);
        var targetInfo = new FileInfo(targetPath);
        
        if (targetInfo.Exists == false)
        {
            continue;
        }
        
        if (targetInfo.IsReadOnly)
        {
            targetInfo.IsReadOnly = false;
        }
            
        files.Add(prefix);
    }
 
    BaseNames = files.ToArray();
      ]]></Code>
    </Task>
  </UsingTask>
  <UsingTask TaskName="TransformXml" AssemblyFile="$(MSBuildExtensionsPath)\Microsoft\VisualStudio\v11.0\Web\Microsoft.Web.Publishing.Tasks.dll" />
  <Target Name="TransformAdditionalConfigurationFiles">
    <Message Text="Searching for configuration transforms in $(ProjectDir)." />
    <GetTransformFiles ProjectRoot="$(ProjectDir)" BuildConfiguration="$(Configuration)">
      <Output TaskParameter="BaseNames" ItemName="BaseNames" />
    </GetTransformFiles>
    <TransformXml Source="%(BaseNames.Identity).config" Transform="%(BaseNames.Identity).$(Configuration).config" Destination="%(BaseNames.Identity).config" Condition="'@(BaseNames)' != ''" />
    <Message importance="high" Text="Transformed %(BaseNames.Identity).config using %(BaseNames.Identity).$(Configuration).config." Condition="'@(BaseNames)' != ''" />
  </Target>
</Project>

This script hooks into the process that runs the TransformXml task on web.config and runs it on all the other *.$(Configuration).config files that it can find (for example, system.diagnostics.Release.config when building Release).

How do Parallel methods behave on a single core machine?

I’ve been wondering about this for a long time. None of the reading that I’ve done has conclusively answered this question. It just so happens that I’ve been developing in a VM for the last couple of months. I got the chance tonight to downgrade the VM specs to a single core to run some tests. The results were very pleasing, yet completely unexpected.

My test code is very unscientific but achieves the objective.

using System;
using System.Collections.Concurrent;
using System.Diagnostics;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

namespace ConsoleApplication1
{
    class Program
    {
        private static ConcurrentBag<int> _taskThreadIds = new ConcurrentBag<int>();
        private static ConcurrentBag<int> _parallelThreadIds = new ConcurrentBag<int>();

        static void Main(string[] args)
        {
            Console.WriteLine("Starting task array on {0}", Thread.CurrentThread.ManagedThreadId);
            Stopwatch watch = Stopwatch.StartNew();

            Task[] tasks = new Task[100];

            for (int i = 0; i < 100; i++)
            {
                tasks[i] = Task.Factory.StartNew(TaskAction, i);
            }

            Task.WaitAll(tasks);

            watch.Stop();

            OutputResults(_taskThreadIds, watch, "task array");

            Console.WriteLine("Starting parallel loop on {0}", Thread.CurrentThread.ManagedThreadId);
            watch = Stopwatch.StartNew();

            Parallel.For(0, 100, ParallelAction);

            watch.Stop();

            OutputResults(_parallelThreadIds, watch, "parallel");

            Console.WriteLine("Press key to close");
            Console.ReadKey();
        }

        private static void OutputResults(ConcurrentBag<int> ids, Stopwatch watch, string testType)
        {
            var allIds = ids.ToList();
            var uniqueIds = allIds.Distinct().ToList();

            Console.WriteLine("Completed {0} on {1} in {2} milliseconds using {3} threads", testType,
                              Thread.CurrentThread.ManagedThreadId, watch.ElapsedMilliseconds, uniqueIds.Count);

            for (int i = 0; i < uniqueIds.Count; i++)
            {
                Console.WriteLine("Thread {0} was used {1} times", uniqueIds[i], allIds.Count(x => x == uniqueIds[i]));
            }
        }

        private static void TaskAction(object x)
        {
            _taskThreadIds.Add(Thread.CurrentThread.ManagedThreadId);

            //Console.WriteLine("{0}: Starting on {1}", x, Thread.CurrentThread.ManagedThreadId);
            Thread.Sleep(500);
            //Console.WriteLine("{0}: Completing on {1}", x, Thread.CurrentThread.ManagedThreadId);
        }

        private static void ParallelAction(int x)
        {
            _parallelThreadIds.Add(Thread.CurrentThread.ManagedThreadId);

            //Console.WriteLine("{0}: Starting on {1}", x, Thread.CurrentThread.ManagedThreadId);
            Thread.Sleep(500);
            //Console.WriteLine("{0}: Completing on {1}", x, Thread.CurrentThread.ManagedThreadId);
        }
    }
}

And here is the answer.

[image: console output of the test run]

The obvious observation is that Task.WaitAll on an array of tasks is almost three times slower. It uses a lower, but similar number of threads for the execution compared to Parallel. The most interesting outcome is the distribution of thread usage. This is where the real performance kicks in.

I was honestly expecting Parallel to not execute asynchronously in a single core environment, as the documentation seems to be completely geared towards multi-core systems. I am really happy with this outcome, especially because the Parallel version results in much cleaner code.
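As an aside, a similar constraint can be explored in code without downgrading a VM by capping Parallel.For with ParallelOptions. This is not the same as a true single core machine, but it shows the degree of parallelism is configurable:

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

public static class SingleCoreSimulation
{
    // Runs a parallel loop capped at a single concurrent worker and
    // returns the number of distinct threads that executed the body.
    public static int CountThreadsUsed(int iterations)
    {
        var threadIds = new ConcurrentBag<int>();
        var options = new ParallelOptions
        {
            MaxDegreeOfParallelism = 1
        };

        Parallel.For(0, iterations, options, x =>
        {
            threadIds.Add(Thread.CurrentThread.ManagedThreadId);
        });

        return threadIds.Distinct().Count();
    }
}
```

With the degree of parallelism capped at one, the calling thread runs every iteration itself.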

Bridging the gap between Font Awesome, Twitter Bootstrap, MVC and Nuget

I’m working on an MVC project which pulls in lots of Nuget packages. Font Awesome was the latest addition, but there is a discrepancy between the font file locations in the css url references and the location that the Nuget package uses in the MVC project. The css files reference font/ relative to the current resource, while the Nuget package puts all the resources into Content/font/.

I don’t want to change either the css file from the Nuget package or the location of the fonts used by the Nuget package. Doing so will just cause upgrade pain when a new version of Font Awesome gets released to Nuget.

I looked at custom routing options but these just seemed to cause more problems than they solved. It then dawned on me that I could use an IBundleBuilder implementation now that I know how they work.

This is what I have for my current Font Awesome StyleBundle configuration.

bundles.Add(
    new StyleBundle("~/css/fontawesome")
        .Include("~/Content/font-awesome.css"));

The contents of the css file need to be adjusted when the bundle is created so that they include the Content directory, but also make the resource reference relative to the application root. Enter the ReplaceContentsBundleBuilder.

namespace MyNamespace
{
    using System.Collections.Generic;
    using System.IO;
    using System.Web.Optimization;
    using Seterlund.CodeGuard;

    public class ReplaceContentsBundleBuilder : IBundleBuilder
    {
        private readonly string _find;

        private readonly string _replaceWith;

        private readonly IBundleBuilder _builder;

        public ReplaceContentsBundleBuilder(string find, string replaceWith)
            : this(find, replaceWith, new DefaultBundleBuilder())
        {
        }

        public ReplaceContentsBundleBuilder(string find, string replaceWith, IBundleBuilder builder)
        {
            Guard.That(() => find).IsNotNullOrEmpty();
            Guard.That(() => replaceWith).IsNotNullOrEmpty();
            Guard.That(() => builder).IsNotNull();

            _find = find;
            _replaceWith = replaceWith;
            _builder = builder;
        }

        public string BuildBundleContent(Bundle bundle, BundleContext context, IEnumerable<FileInfo> files)
        {
            string contents = _builder.BuildBundleContent(bundle, context, files);

            return contents.Replace(_find, _replaceWith);
        }
    }
}

This class makes it super easy to modify the bundle on the fly to give me the translation that I require without having to tweak anything coming from Nuget. The bundle config is now:

bundles.Add(
    new StyleBundle("~/css/fontawesome")
    {
        Builder = new ReplaceContentsBundleBuilder("url('font/", "url('/Content/font/")
    }.Include("~/Content/font-awesome.css"));

This will replace any content in the bundle that has a relative reference to font/ with a reference to /Content/font/, which is relative to the site root.
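As a sanity check, this is the rewrite applied to a paraphrased Font Awesome css rule (not the exact package content):

```csharp
public static class FontPathRewriter
{
    // Mirrors what ReplaceContentsBundleBuilder does for this bundle:
    // rewrite relative font urls to be relative to the site root.
    public static string Rewrite(string css)
    {
        return css.Replace("url('font/", "url('/Content/font/");
    }
}
```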

Easy done.

MVC bundling and line comments at the end of files

Recently the bundling and minification support in ASP.Net MVC4 has been causing grief, with the bundled JavaScript containing unexpected tokens. The minification process fails to process the bundle of scripts correctly, although it does kindly add a failure message to the top of the bundle output.

/* Minification failed. Returning unminified contents.
(5,2-3): run-time warning JS1195: Expected expression: *
(11,60-61): run-time warning JS1004: Expected ';': {
(395,2-3): run-time warning JS1195: Expected expression: )
(397,21-22): run-time warning JS1004: Expected ';': {
(397,4590-4591): run-time warning JS1195: Expected expression: )
(398,28-29): run-time warning JS1195: Expected expression: )
(398,84-85): run-time warning JS1002: Syntax error: }
(402,44-45): run-time warning JS1195: Expected expression: )
(408,1-2): run-time warning JS1002: Syntax error: }
(393,5-22): run-time warning JS1018: 'return' statement outside of function: return Modernizr;
(404,5,406,16): run-time warning JS1018: 'return' statement outside of function: return !!('placeholder' in (Modernizr.input || document.createElement('input')) &&
               'placeholder' in (Modernizr.textarea || document.createElement('textarea'))
             );
 */

The issue shows up when bundling the jQuery set of files, which end with a //@ sourceMappingURL=… line comment. The StackOverflow post here explains why this is happening. The short story is that the contents of each file are directly appended onto the previous file. If the previous file ended in a line comment, then the start of the following file is also commented out.
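The effect is easy to reproduce with plain string concatenation. The sketch below mimics bundling two scripts, the first of which ends in a sourceMappingURL line comment:

```csharp
using System;

public static class BundleConcatDemo
{
    // Naive bundling: append the second file directly after the first,
    // which lets a trailing line comment swallow the next file's first line.
    public static string NaiveBundle(string first, string second)
    {
        return first + second;
    }

    // Safer bundling: terminate the previous file with ";" on its own line
    // so a trailing line comment cannot comment out the next file.
    public static string SafeBundle(string first, string second)
    {
        return first + Environment.NewLine + ";" + Environment.NewLine + second;
    }
}
```

The naive version produces a single line where the next file's code sits inside the comment; the safe version keeps the files on separate lines.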

The post suggests that you can either remove the line comment from all of the js files or change the line comment to a block comment. I don’t like either of these solutions because many of the scripts are sourced from Nuget and these solutions would cause upgrade pain. We can solve this problem by using some custom bundling logic.

namespace MyNamespace
{
    using System.Web.Optimization;

    public class NewLineScriptBundle : ScriptBundle
    {
        public NewLineScriptBundle(string virtualPath) : base(virtualPath)
        {
            Builder = new NewLineBundleBuilder();
        }

        public NewLineScriptBundle(string virtualPath, string cdnPath) : base(virtualPath, cdnPath)
        {
            Builder = new NewLineBundleBuilder();
        }
    }
}

This is the class that you should use instead of ScriptBundle. It simply uses a custom bundle builder.

namespace MyNamespace
{
    using System.Collections.Generic;
    using System.IO;
    using System.Text;
    using System.Web.Optimization;
    using Microsoft.Ajax.Utilities;

    public class NewLineBundleBuilder : IBundleBuilder
    {
        public string BuildBundleContent(Bundle bundle, BundleContext context, IEnumerable<FileInfo> files)
        {
            var content = new StringBuilder();

            foreach (var fileInfo in files)
            {
                var contents = Read(fileInfo);
                var parser = new JSParser(contents);

                var bundleValue = parser.Parse(parser.Settings).ToCode();

                content.Append(bundleValue);
                content.AppendLine(";");
            }

            return content.ToString();
        }

        protected virtual string Read(FileInfo file)
        {
            using (var reader = file.OpenText())
            {
                return reader.ReadToEnd();
            }
        }
    }
}

This custom bundle class reads the script files and separates them with a ; and a new line. This then allows the minification engine to correctly process the bundle because any line comment on the end of a file will no longer affect the script in the next file of the bundle.

Beware of returning IEnumerable in a Web Api action

I have been hitting an issue with MVC4 Web Api where my global error handling filters were not executed. The only advice out there is that they will not execute if the action throws an HttpResponseException.

I have finally figured out that returning a lazy IEnumerable instance will also cause the global error handler to not execute. In fact, it won’t cause controller or action level exception filters to execute either.

Consider the following:

namespace MyService
{
    using System;
    using System.Collections.Generic;
    using System.Collections.ObjectModel;
    using System.Linq;
    using System.Net;
    using System.Net.Http;
    using System.Web.Http;
    using System.Web.Http.ModelBinding;

    public class TestController : ApiController
    {
        [HttpGet]
        public HttpResponseMessage MyAction()
        {
            var data = BuildData();

            return Request.CreateResponse(HttpStatusCode.OK, data);
        }

        private static IEnumerable<int> BuildData()
        {
            yield return 1;

            throw new InvalidOperationException();
        }
    }
}

This will write information using a registered ITraceWriter implementation (which has its problems) but will not fire the exception filter attributes. The reason is presumably that the part of the pipeline that evaluates the data to send in the response (thereby forcing the enumeration and hitting the exception) is beyond the part of the pipeline that is covered by the error handling.

The fix is to put a .ToList() call on the IEnumerable before providing the data to Request.CreateResponse. This forces the enumeration to execute within the controller, so the exception from the enumeration is thrown there and can be handled by the exception filter attributes.
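The underlying behaviour is just deferred execution. A stripped-down sketch without Web Api shows that the exception only surfaces when the sequence is materialised:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class DeferredExecutionDemo
{
    public static IEnumerable<int> BuildData()
    {
        yield return 1;

        throw new InvalidOperationException("enumeration failed");
    }

    // Creating the enumerable never throws; the iterator body has not run.
    public static bool ThrowsOnCreate()
    {
        try
        {
            var data = BuildData();
            return false;
        }
        catch (InvalidOperationException)
        {
            return true;
        }
    }

    // Materialising with ToList forces enumeration and hits the exception.
    public static bool ThrowsOnToList()
    {
        try
        {
            var data = BuildData().ToList();
            return false;
        }
        catch (InvalidOperationException)
        {
            return true;
        }
    }
}
```

Calling ToList inside the controller action moves the failure from the response serialisation step back into code the exception filters cover.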