NewRelic Azure NuGet package update pain again

I previously posted about issues I found when updating to the latest Azure NuGet package published by NewRelic. Unfortunately the install PowerShell script for the latest package now has more issues than the previous version. Here are the issues I found and how to fix them.

  1. newrelic.cmd not updated

    I believe the issue here (unconfirmed) is that the file is not updated because it has been changed as part of the previous package installation. The fix is to uninstall the previous package first and then manually delete the newrelic.cmd file (see the Package Manager Console sketch below). The uninstall won’t remove the file for the same reason.
  2. The license key is not written to newrelic.cmd

    This is a bug in the newrelic.cmd file published with the package. The install script expects the file to contain the line SET LICENSE_KEY=REPLACE_WITH_LICENSE_KEY so that the installer can substitute in your key. Unfortunately the cmd file in the package contains an actual license key rather than the expected placeholder. This means someone else’s license key will be used rather than your own.
  3. Same as my previous experience, ServiceDefinition.csdef was not found. You need to manually update this file to contain the following XML (slightly different to the previous version).
<Startup>
  <Task commandLine="newrelic.cmd" executionContext="elevated" taskType="simple">
    <Environment>
      <Variable name="EMULATED">
        <RoleInstanceValue xpath="/RoleEnvironment/Deployment/@emulated" />
      </Variable>
      <Variable name="IsWorkerRole" value="false" />
    </Environment>
  </Task>
</Startup>

Note that this XML is for a WebRole only. The WorkerRole XML has additional elements and attributes that you will need to add according to the PowerShell script.
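
For the first issue, the cleanup can be run from the Package Manager Console before installing the updated package. This is just a sketch; the package id (NewRelicWindowsAzure) and the project path are assumptions, so check what your own solution references.

PM> Uninstall-Package NewRelicWindowsAzure
PM> Remove-Item .\MyWebRole\newrelic.cmd   # illustrative path to the stale file
PM> Install-Package NewRelicWindowsAzure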

Supporting XML transformation on multiple configuration files

Visual Studio has a great feature for web.config files where XML transformations can be done based on the current configuration. This is typically actioned when the web application is published. Unfortunately the MSBuild scripts only cater for web.config. This is a problem when you start to break up your configuration into multiple files and link them back using the configSource attribute.

<system.diagnostics configSource="system.diagnostics.config" />

One of the great things about MSBuild is that you can change it if you don’t like it. The solution to this issue is to open the proj file and add the following at the end of the file.

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

  <!-- Existing proj content is here -->

  <PropertyGroup>
    <TransformWebConfigCoreDependsOn>
      $(TransformWebConfigCoreDependsOn);
      TransformAdditionalConfigurationFiles;
    </TransformWebConfigCoreDependsOn>
  </PropertyGroup>
  <UsingTask TaskName="GetTransformFiles" TaskFactory="CodeTaskFactory" AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll">
    <ParameterGroup>
      <ProjectRoot ParameterType="System.String" Required="true" />
      <BuildConfiguration ParameterType="System.String" Required="true" />
      <BaseNames ParameterType="System.String[]" Output="true" />
    </ParameterGroup>
    <Task>
      <Reference Include="System.IO" />
      <Using Namespace="System.IO" />
      <Code Type="Fragment" Language="cs"><![CDATA[
 
    var configSuffix = "." + BuildConfiguration + ".config";
    var matchingConfigurations = Directory.GetFiles(ProjectRoot, "*" + configSuffix);
    var files = new List<String>();
        
    foreach (String item in matchingConfigurations)
    {
        var filename = Path.GetFileName(item);
        var suffixIndex = filename.IndexOf(configSuffix);
        var prefix = filename.Substring(0, suffixIndex);
            
        if (prefix.Equals("Web", StringComparison.OrdinalIgnoreCase))
        {
            continue;
        }
        
        var targetFile = prefix + ".config";
        var targetPath = Path.Combine(ProjectRoot, targetFile);
        var targetInfo = new FileInfo(targetPath);
        
        if (targetInfo.Exists == false)
        {
            continue;
        }
        
        if (targetInfo.IsReadOnly)
        {
            targetInfo.IsReadOnly = false;
        }
            
        files.Add(prefix);
    }
 
    BaseNames = files.ToArray();
      ]]></Code>
    </Task>
  </UsingTask>
  <UsingTask TaskName="TransformXml" AssemblyFile="$(MSBuildExtensionsPath)\Microsoft\VisualStudio\v11.0\Web\Microsoft.Web.Publishing.Tasks.dll" />
  <Target Name="TransformAdditionalConfigurationFiles">
    <Message Text="Searching for configuration transforms in $(ProjectDir)." />
    <GetTransformFiles ProjectRoot="$(ProjectDir)" BuildConfiguration="$(Configuration)">
      <Output TaskParameter="BaseNames" ItemName="BaseNames" />
    </GetTransformFiles>
    <TransformXml Source="%(BaseNames.Identity).config" Transform="%(BaseNames.Identity).$(Configuration).config" Destination="%(BaseNames.Identity).config" Condition="'@(BaseNames)' != ''" />
    <Message importance="high" Text="Transformed %(BaseNames.Identity).config using %(BaseNames.Identity).$(Configuration).config." Condition="'@(BaseNames)' != ''" />
  </Target>
</Project>

This script hooks into the process that runs the TransformXml task on web.config and runs it on every other *.&lt;Configuration&gt;.config file that it can find (system.diagnostics.Release.config for a Release build, for example).
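
As an illustration, pairing the system.diagnostics.config file above with a Release transform means adding a system.diagnostics.Release.config file next to it. Here is a minimal sketch using the standard XDT transform syntax; the listener name is made up:

<?xml version="1.0" encoding="utf-8"?>
<system.diagnostics xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <trace>
    <listeners>
      <!-- Remove a noisy debug listener from Release builds (illustrative) -->
      <add name="debugListener" xdt:Transform="Remove" xdt:Locator="Match(name)" />
    </listeners>
  </trace>
</system.diagnostics>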

How do Parallel methods behave on a single core machine?

I’ve been wondering about this for a long time. None of the reading that I’ve done has conclusively answered this question. It just so happens that I’ve been developing in a VM for the last couple of months. I got the chance tonight to downgrade the VM specs to a single core to run some tests. The results were very pleasing, yet completely unexpected.

My test code is very unscientific but achieves the objective.

using System;
using System.Collections.Concurrent;
using System.Diagnostics;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

namespace ConsoleApplication1
{
    class Program
    {
        private static ConcurrentBag<int> _taskThreadIds = new ConcurrentBag<int>();
        private static ConcurrentBag<int> _parallelThreadIds = new ConcurrentBag<int>();

        static void Main(string[] args)
        {
            Console.WriteLine("Starting task array on {0}", Thread.CurrentThread.ManagedThreadId);
            Stopwatch watch = Stopwatch.StartNew();

            Task[] tasks = new Task[100];

            for (int i = 0; i < 100; i++)
            {
                tasks[i] = Task.Factory.StartNew(TaskAction, i);
            }

            Task.WaitAll(tasks);

            watch.Stop();

            OutputResults(_taskThreadIds, watch, "task array");

            Console.WriteLine("Starting parallel loop on {0}", Thread.CurrentThread.ManagedThreadId);
            watch = Stopwatch.StartNew();

            Parallel.For(0, 100, ParallelAction);

            watch.Stop();

            OutputResults(_parallelThreadIds, watch, "parallel");

            Console.WriteLine("Press key to close");
            Console.ReadKey();
        }

        private static void OutputResults(ConcurrentBag<int> ids, Stopwatch watch, string testType)
        {
            var allIds = ids.ToList();
            var uniqueIds = allIds.Distinct().ToList();

            Console.WriteLine("Completed {0} on {1} in {2} millseconds using {3} threads", testType,
                              Thread.CurrentThread.ManagedThreadId, watch.ElapsedMilliseconds, uniqueIds.Count);

            for (int i = 0; i < uniqueIds.Count; i++)
            {
                Console.WriteLine("Thread {0} was used {1} times", uniqueIds[i], allIds.Count(x => x == uniqueIds[i]));
            }
        }

        private static void TaskAction(object x)
        {
            _taskThreadIds.Add(Thread.CurrentThread.ManagedThreadId);

            //Console.WriteLine("{0}: Starting on {1}", x, Thread.CurrentThread.ManagedThreadId);
            Thread.Sleep(500);
            //Console.WriteLine("{0}: Completing on {1}", x, Thread.CurrentThread.ManagedThreadId);
        }

        private static void ParallelAction(int x)
        {
            _parallelThreadIds.Add(Thread.CurrentThread.ManagedThreadId);

            //Console.WriteLine("{0}: Starting on {1}", x, Thread.CurrentThread.ManagedThreadId);
            Thread.Sleep(500);
            //Console.WriteLine("{0}: Completing on {1}", x, Thread.CurrentThread.ManagedThreadId);
        }
    }
}

And here is the answer.

[Image: console output showing the timing and thread usage of both tests]

The obvious observation is that Task.WaitAll on an array of tasks is almost three times slower. It uses a similar, though slightly lower, number of threads for the execution compared to Parallel. The most interesting outcome is the distribution of work across those threads. This is where the real performance difference comes from.

I was honestly expecting Parallel to not execute asynchronously in a single core environment as the documentation seems to be completely geared towards multi-core systems. I am really happy with this outcome, especially because the code flow makes better use of generics.

Bridging the gap between Font Awesome, Twitter Bootstrap, MVC and Nuget

I’m working on an MVC project which pulls in lots of Nuget packages. Font Awesome was the latest package but there is a discrepancy in the locations for the font files between the css url references and the location that the Nuget package uses for the MVC project. The css files use a reference to font/ relative to the current resource. The Nuget package puts all the resources into Content/font/.

I don’t want to change either the css file from the Nuget package or the location of the fonts used by the Nuget package. Doing so will just cause upgrade pain when a new version of Font Awesome gets released to Nuget.

I looked at custom routing options but these just seemed to cause more problems than they solved. It then dawned on me that I could use an IBundleBuilder implementation now that I know how they work.

This is what I have for my current Font Awesome StyleBundle configuration.

bundles.Add(
    new StyleBundle("~/css/fontawesome")
        .Include("~/Content/font-awesome.css"));

The contents of the css file need to be adjusted when the bundle is created so that they include the Content directory, but also make the resource reference relative to the application root. Enter the ReplaceContentsBundleBuilder.

namespace MyNamespace
{
    using System.Collections.Generic;
    using System.IO;
    using System.Web.Optimization;
    using Seterlund.CodeGuard;

    public class ReplaceContentsBundleBuilder : IBundleBuilder
    {
        private readonly string _find;

        private readonly string _replaceWith;

        private readonly IBundleBuilder _builder;

        public ReplaceContentsBundleBuilder(string find, string replaceWith)
            : this(find, replaceWith, new DefaultBundleBuilder())
        {
        }

        public ReplaceContentsBundleBuilder(string find, string replaceWith, IBundleBuilder builder)
        {
            Guard.That(() => find).IsNotNullOrEmpty();
            Guard.That(() => replaceWith).IsNotNullOrEmpty();
            Guard.That(() => builder).IsNotNull();

            _find = find;
            _replaceWith = replaceWith;
            _builder = builder;
        }

        public string BuildBundleContent(Bundle bundle, BundleContext context, IEnumerable<FileInfo> files)
        {
            string contents = _builder.BuildBundleContent(bundle, context, files);

            return contents.Replace(_find, _replaceWith);
        }
    }
}

This class makes it super easy to modify the bundle on the fly to give me the translation that I require without having to tweak anything coming from Nuget. The bundle config is now:

bundles.Add(
    new StyleBundle("~/css/fontawesome")
    {
        Builder = new ReplaceContentsBundleBuilder("url('font/", "url('/Content/font/")
    }.Include("~/Content/font-awesome.css"));

This will replace any content in the bundle that has a relative reference to font/ with a reference to /Content/font/ which is relative to the site root.
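
To make that concrete, a packaged rule like this (the font file name is illustrative):

src: url('font/fontawesome-webfont.woff') format('woff');

comes out of the bundle as:

src: url('/Content/font/fontawesome-webfont.woff') format('woff');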

Easy done.

MVC bundling and line comments at the end of files

Recently the bundling and minification support in ASP.Net MVC4 has been causing grief, with bundled JavaScript throwing unexpected token errors. The minification process fails to process the bundle of scripts correctly, although it does kindly add a failure message to the top of the bundle output.

/* Minification failed. Returning unminified contents.
(5,2-3): run-time warning JS1195: Expected expression: *
(11,60-61): run-time warning JS1004: Expected ';': {
(395,2-3): run-time warning JS1195: Expected expression: )
(397,21-22): run-time warning JS1004: Expected ';': {
(397,4590-4591): run-time warning JS1195: Expected expression: )
(398,28-29): run-time warning JS1195: Expected expression: )
(398,84-85): run-time warning JS1002: Syntax error: }
(402,44-45): run-time warning JS1195: Expected expression: )
(408,1-2): run-time warning JS1002: Syntax error: }
(393,5-22): run-time warning JS1018: 'return' statement outside of function: return Modernizr;
(404,5,406,16): run-time warning JS1018: 'return' statement outside of function: return !!('placeholder' in (Modernizr.input || document.createElement('input')) &&
               'placeholder' in (Modernizr.textarea || document.createElement('textarea'))
             );
 */

The issues have been found when bundling the jQuery set of files, which end with a //@ sourceMappingURL=… line comment. The StackOverflow post here explains why this is happening. The short story is that the contents of each file are directly appended onto the previous file. If the previous file ended in a line comment, then the following file is also partially commented out.

The post suggests that you can either remove the line comment from all of the js files or change the line comment to a block comment. I don’t like either of these solutions because many of the scripts are sourced from Nuget and these solutions would cause upgrade pain. We can solve this problem by using some custom bundling logic.

namespace MyNamespace
{
    using System.Web.Optimization;

    public class NewLineScriptBundle : ScriptBundle
    {
        public NewLineScriptBundle(string virtualPath) : base(virtualPath)
        {
            Builder = new NewLineBundleBuilder();
        }

        public NewLineScriptBundle(string virtualPath, string cdnPath) : base(virtualPath, cdnPath)
        {
            Builder = new NewLineBundleBuilder();
        }
    }
}

This is the class that you should use instead of ScriptBundle. It simply uses a custom bundle builder.

namespace MyNamespace
{
    using System.Collections.Generic;
    using System.IO;
    using System.Text;
    using System.Web.Optimization;
    using Microsoft.Ajax.Utilities;

    public class NewLineBundleBuilder : IBundleBuilder
    {
        public string BuildBundleContent(Bundle bundle, BundleContext context, IEnumerable<FileInfo> files)
        {
            var content = new StringBuilder();

            foreach (var fileInfo in files)
            {
                var contents = Read(fileInfo);
                var parser = new JSParser(contents);

                var bundleValue = parser.Parse(parser.Settings).ToCode();

                content.Append(bundleValue);
                content.AppendLine(";");
            }

            return content.ToString();
        }

        protected virtual string Read(FileInfo file)
        {
            using (var reader = file.OpenText())
            {
                return reader.ReadToEnd();
            }
        }
    }
}

This custom bundle builder reads each script file and separates them with a ; and a new line. The minification engine can then correctly process the bundle, because a line comment at the end of one file no longer affects the script in the next file of the bundle.
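
To use it, register script bundles with the new type instead of ScriptBundle. The bundle path and script names below are illustrative:

bundles.Add(
    new NewLineScriptBundle("~/bundles/jquery")
        .Include("~/Scripts/jquery-{version}.js", "~/Scripts/jquery.validate.js"));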

Beware of returning IEnumerable in a Web Api action

I have been hitting an issue with MVC4 Web Api where my global error handling filters were not executed. The only advice out there is that they will not execute if the action throws an HttpResponseException.

I have finally figured out that returning a lazy IEnumerable instance will also cause the global error handler to not execute. In fact, it won’t cause controller or action level exception filters to execute either.

Consider the following:

namespace MyService
{
    using System;
    using System.Collections.Generic;
    using System.Collections.ObjectModel;
    using System.Linq;
    using System.Net;
    using System.Net.Http;
    using System.Web.Http;
    using System.Web.Http.ModelBinding;

    public class TestController : ApiController
    {
        [HttpGet]
        public HttpResponseMessage MyAction()
        {
            var data = BuildData();

            return Request.CreateResponse(HttpStatusCode.OK, data);
        }

        private static IEnumerable<int> BuildData()
        {
            yield return 1;

            throw new InvalidOperationException();
        }
    }
}

This will write information using a registered ITraceWriter implementation (which has its own problems) but will not fire the exception filter attributes. The reason is presumably that the part of the pipeline that evaluates the data to send in the response (therefore forcing the enumeration and hitting the exception) sits beyond the part of the pipeline that is covered by the error handling.

The fix is to put a .ToList() call on the IEnumerable before providing the data to Request.CreateResponse. This forces the enumeration to execute within the controller, so the exception from the enumeration is thrown there and is then handled by the exception filter attributes.
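
A minimal sketch of the fixed action from the example above:

[HttpGet]
public HttpResponseMessage MyAction()
{
    // ToList() forces enumeration inside the action, so the
    // InvalidOperationException surfaces where exception filters apply.
    var data = BuildData().ToList();

    return Request.CreateResponse(HttpStatusCode.OK, data);
}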

Using RestSharp to pass a set of variables to an ASP.Net MVC Web API action

Looks like a lot of people hit this issue and come up with lots of “interesting” solutions to get this to work. The answer is surprisingly simple however.

Assume that the controller action is like the following:

[HttpGet]
public HttpResponseMessage MyAction(
    [ModelBinder]List<string> referenceNames, DateTime startDate, DateTime endDate)
{
    // Action logic elided; a real implementation returns an HttpResponseMessage here.
    return Request.CreateResponse(HttpStatusCode.OK);
}

How do you get RestSharp to send the set of strings to the action so that they are deserialized correctly? The answer is like this.

var client = new RestClient(Config.ServiceAddress);
var request = new RestRequest(ActionLocation, Method.GET);

request.RequestFormat = DataFormat.Json;
request.AddParameter("ReferenceNames", "NameA");
request.AddParameter("ReferenceNames", "NameB");
request.AddParameter("StartDate", DateTime.Now.AddYears(-2).ToString("D"));
request.AddParameter("EndDate", DateTime.Now.AddYears(2).ToString("D"));

There are two things that make this work:

  1. Attribute the action parameter with [ModelBinder] – see here for more info
  2. Make multiple calls to request.AddParameter for the same parameter name

Do this and you should be good to go.
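
Under the covers, RestSharp sends GET parameters on the query string, so the request ends up looking roughly like this (dates elided):

GET /MyAction?ReferenceNames=NameA&ReferenceNames=NameB&StartDate=...&EndDate=... HTTP/1.1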

Dynamic realm discovery for federated authentication

I have a web role (RP) running in Windows Azure that uses ACS 2.0 as the identity provider (IP). The web role is configured with a certificate to work with the authentication negotiation and subsequent security session. The certificate supports both domain.com and www.domain.com. The issue is that the federation authentication configuration of the web role can only specify one realm and the realm attribute is a required value.

<wsFederation passiveRedirectEnabled="true" issuer="http://[addressOfAcs]" realm="http://www.domain.com" requireHttps="true" />

This works great if the user is browsing on www.domain.com and then goes through the authentication process. A security token will be issued for www.domain.com, and the user is redirected back there. The RP will then validate that the token was issued to the configured audience uri. Thankfully the audience uris allow multiple addresses to be specified.

<audienceUris>
  <add value="http://www.domain.com" />
  <add value="http://domain.com" />
</audienceUris>

The problem is when you want to use the www-less address or any other host header for that matter. In this case, the user is browsing domain.com and goes through the authentication process. The token will be issued for www.domain.com but the user is then redirected back to the original address under the domain.com location. An exception is then thrown at this point.

Error message

System.IdentityModel.Services.FederationException: ID3206: A SignInResponse message may only redirect within the current web application: 'http://domain.com/' is not allowed.

Stack trace

at System.IdentityModel.Services.WSFederationAuthenticationModule.SignInWithResponseMessage(HttpRequestBase request)
at System.IdentityModel.Services.WSFederationAuthenticationModule.OnAuthenticateRequest(Object sender, EventArgs args)
at System.Web.HttpApplication.SyncEventExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)

The fix here is to put together some dynamic realm discovery logic. Creating a custom WSFederationAuthenticationModule class provides some good hooks into the authentication process. We want to override OnRedirectingToIdentityProvider and set the realm as per the location of the current request.

For example:

namespace MyApplication.Web.Security
{
    using System;
    using System.IdentityModel.Services;
    using System.Security.Principal;
    using System.Web;
    using Seterlund.CodeGuard;

    public class DynamicRealmFederationAuthenticationModule : WSFederationAuthenticationModule
    {
        protected override void OnRedirectingToIdentityProvider(RedirectingToIdentityProviderEventArgs e)
        {
            e.SignInRequestMessage.Realm = DetermineDynamicRealm();

            base.OnRedirectingToIdentityProvider(e);
        }

        private static Uri BuildRequestedAddress(HttpRequest request)
        {
            Guard.That(() => request).IsNotNull();
            Guard.That(() => request.Headers).IsNotNull();
            Guard.That(() => request.Url).IsNotNull();

            var originalRequest = request.Url;
            var serverName = request.Headers["Host"];
            var address = string.Concat(originalRequest.Scheme, "://", serverName);

            address += originalRequest.PathAndQuery;

            return new Uri(address);
        }

        private string DetermineDynamicRealm()
        {
            // Set the realm to be the current domain name
            string realm;
            const string SecureHttp = "https://";
            var hostUri = BuildRequestedAddress(HttpContext.Current.Request);
            var port = string.Empty;

            if (Realm.StartsWith(SecureHttp, StringComparison.OrdinalIgnoreCase))
            {
                realm = SecureHttp;

                if (hostUri.Port != 443)
                {
                    port = ":" + hostUri.Port;
                }
            }
            else
            {
                realm = "http://";

                if (hostUri.Port != 80)
                {
                    port = ":" + hostUri.Port;
                }
            }

            realm += hostUri.Host + port;

            return realm;
        }
    }
}

A couple of things to note about this code.

Firstly, the DetermineDynamicRealm method could have been streamlined, but I also needed it to cater for my local development environment. The local environment uses the Azure SDK emulator for the web role and a custom website project as a development STS. Both the web role and the development STS use straight http as well as custom ports. The DetermineDynamicRealm method logic caters for both local development and production deployment.

Secondly, the BuildRequestedAddress method uses request headers to figure out the port of the request from the browser. This is because the web role sits behind a load balancer that does some trickery with ports. We need to look at the headers of the request to determine the port as the browser sees it rather than how the request on the current HttpContext sees it.

Next up, the web.config needs to be updated to use this module rather than the module that comes out of the box.

<add name="WSFederationAuthenticationModule" type="MyApplication.Web.Security.DynamicRealmFederationAuthenticationModule, MyApplication.Web" preCondition="managedHandler" />

The only remaining step is to set up ACS with an additional RP configuration for domain.com. You should now be able to authenticate and be redirected back to either domain.com or www.domain.com.

Fixing New Relic Nuget package for Azure

I tried to install the New Relic Nuget package for an Azure solution. Unfortunately the Nuget install script failed to update the ServiceDefinition.csdef file and provided the following error.

Unable to find the ServiceDefinition.csdef file in your solution, please make sure your solution contains an Azure deployment project and try again.

One of the good things about Nuget is that the install scripts are easily available to the solution.

The install script was trying to add the following XML fragment to the start of the web role definition in the ServiceDefinition.csdef file.

<Startup>
  <Task commandLine="newrelic.cmd" executionContext="elevated" taskType="simple">
    <Environment>
      <Variable name="EMULATED">
        <RoleInstanceValue xpath="/RoleEnvironment/Deployment/@emulated"/>
      </Variable>
    </Environment>
  </Task> 
</Startup>

Add this manually and we are good to go.

Causing a VS Web Project to fail compilation when content is missing

Sometimes things go wrong. For example, files are not on your local disk when you expect them to be. I see this pop up every now and then when working in a team environment or even as a single developer across multiple machines. Usually it is because something has not been submitted into source control. Maybe the file was never bound to source control in the first place. I have seen Visual Studio simply miss files as well.

Normally this is not a problem. The compiler will complain if a C# class file is missing because it can’t do its job. Unfortunately a Web project does not throw a compiler error when a content file is missing. Thankfully there is an easy fix.

Take a new Web project for example.

[Image: a new Web project in Solution Explorer]

We can simulate this issue by simply going down into the file system and renaming Site.css to another name.

[Image: Site.css renamed in the file system]

The project will now identify the missing file in Solution Explorer if you refresh the window.

[Image: Solution Explorer flagging the missing Site.css]

Ok, but what about a build?

1>------ Rebuild All started: Project: MvcApplication1, Configuration: Debug Any CPU ------
1>  MvcApplication1 -> c:\Dev\MvcApplication1\bin\MvcApplication1.dll
========== Rebuild All: 1 succeeded, 0 failed, 0 skipped ==========

Build Summary
-------------
00:00.883 - Success - Debug Any CPU - MvcApplication1\MvcApplication1.csproj

Total build time: 00:00.893

========== Rebuild All: 1 succeeded or up-to-date, 0 failed, 0 skipped ==========

I see this as a problem. You can check this in and every other developer/machine will be essentially broken. Worse still, any TFS gated build will pass as well. There is therefore no way out of the box to ensure that you do not corrupt your code line.

The trick here is to modify the project file. Edit the project file and you will find the following at the bottom:

<!-- To modify your build process, add your task inside one of the targets below and uncomment it. 
     Other similar extension points exist, see Microsoft.Common.targets.
<Target Name="BeforeBuild">
</Target>
<Target Name="AfterBuild">
</Target> -->

We want to remove the comments surrounding BeforeBuild and change it to the following:

<!-- To modify your build process, add your task inside one of the targets below and uncomment it. 
     Other similar extension points exist, see Microsoft.Common.targets.
<Target Name="AfterBuild">
</Target> -->
<Target Name="BeforeBuild">
  <ItemGroup>
    <MissingContentFiles Include="@(Content)" Condition="!Exists('%(Content.FullPath)')" />
  </ItemGroup>
  <Message Text="Item: %(Content.FullPath)" />
  <Message Text="Missing: %(MissingContentFiles.FullPath)" />
  <Error Text="Content file '%(MissingContentFiles.FullPath)' was not found." Condition="'@(MissingContentFiles)' != ''" ContinueOnError="true"  />
  <Error Text="One or more content files are missing." Condition="'@(MissingContentFiles)' != ''" />
</Target>

This will output a warning for each file that is missing and then fail the compilation. The first Error task specifies ContinueOnError="true", which MSBuild logs as a warning and keeps going, so you get to see every missing file before the second Error task halts the build.

1>------ Build started: Project: MvcApplication1, Configuration: Debug Any CPU ------
1>C:\Dev\MvcApplication1\MvcApplication1.csproj(329,5): warning : Content file 'C:\Dev\MvcApplication1\Scripts\jquery.validate.min.js' was not found.
1>C:\Dev\MvcApplication1\MvcApplication1.csproj(329,5): warning : Content file 'C:\Dev\MvcApplication1\Content\Site.css' was not found.
1>C:\Dev\MvcApplication1\MvcApplication1.csproj(330,5): error : One or more content files are missing.
========== Build: 0 succeeded, 1 failed, 0 up-to-date, 0 skipped ==========

Build Summary
-------------
00:00.649 - Failed  - Debug Any CPU - MvcApplication1\MvcApplication1.csproj

Total build time: 00:00.910

========== Build: 0 succeeded or up-to-date, 1 failed, 0 skipped ==========

[Image: Error List showing the missing content warnings and the final error]

Job done.