Tuesday, 23 October 2012

Getting all profile happy with JustTrace

Recently I have been developing a fair-sized application and so far everything seemed to be "just working" fine. However, during testing I noticed the app seemed to stall a few times. This left me quite unhappy and I knew it was time to take another look at some of my "decisions". Initial glances through the code didn't really highlight anything, so I thought it was about time to profile the application and see what I could find.

In the past I have used Redgate's Ants Profiler and an early version of Telerik's JustTrace. I decided to give Telerik's JustTrace a go again as I have heard a lot about it and wanted to see how it has changed and grown.

Alas, my initial attempt to use JustTrace ended quite abruptly. I've been developing on Windows 8 and VS2012 exclusively for about 5 months now, and sadly JustTrace didn't support IIS 8 or IIS Express 8. I was gutted to find this out, but a quick chat with the Telerik dev team and Chris Eargle gave me hope: they advised me the Q3 update was right around the corner, 1 week to be exact, and it would bring all the new toys into the game.

HURRAH!

So tonight I got it all updated and fired up, and started profiling my app. Profiling is immensely easy to perform: simply enable JustTrace in the Telerik menu and hit F5 :) You are asked how you want to profile; I chose to simply [].

You then simply start using your application as normal and JustTrace sits there observing and recording what it is up to. Once you have spent some time testing you can really start digging around. The first thing to notice is the Timeline view, which shows all the CPU activity whilst running through the application. You want to start looking at where you have some peaks, and in particular extended CPU activity.
Timeline View
By clicking on an area of the timeline you get a larger view where you can highlight a time snapshot to view.
Show Snapshot

You then have a ton of options to find your issue: you can look at the call trees across all threads, look at all methods, and even jump straight into "hot spots". Hot spots are each thread's most expensive methods in terms of time. In my current snapshot Assembly.Load was very expensive, but that was app startup, not what I had been noticing.

Hot Spot View


By looking at the other peaks I did find where 3 seconds was going: executing queries against the database. Now this could be down to bad queries, but it's more likely to be a bad use of LINQ with Entity Framework, and not compiling queries etc. However, now I know where to look, and it only took me 5 - 10 minutes of profiling to find out. Fantastic stuff, and I've not even touched memory profiling!

My Hot Spot
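As a first stab at the "not compiling queries" point, a compiled LINQ-to-Entities query is the sort of thing I'll be trying. Here's a minimal sketch with the ObjectContext-based Entity Framework API of the time; MyEntities, Orders and CustomerId are made-up names for illustration:

// using System;
// using System.Linq;
// using System.Data.Objects;   // CompiledQuery lives here in the ObjectContext API

// The LINQ expression is translated to SQL once and the translation is reused on
// every call, rather than being rebuilt each time the query runs.
static readonly Func<MyEntities, int, IQueryable<Order>> OrdersForCustomer =
    CompiledQuery.Compile((MyEntities ctx, int customerId) =>
        ctx.Orders.Where(o => o.CustomerId == customerId));

// Usage: var orders = OrdersForCustomer(context, 42).ToList();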
Hopefully I can give that a go before my trial runs out, but so far JustTrace looks worth investing in; here's hoping I can get some BizSpark discount ;)

Wednesday, 10 October 2012

Deploying Non Project Files with Web Deploy

Deploying websites and web applications has changed massively over the years. I have seen the likes of FTP, copy & paste over RDP (why!!), etc., but there has always been a degree of doubt: did I copy the correct files, did I miss any, did I upload the correct client config?

Since VS2010 I have been a big fan of always publishing my applications, either to a local folder and then uploading, or directly to the server, depending on the setup.

One thing I have always struggled with is publishing files that aren't part of my project, and ensuring the correct client config settings are uploaded. Now that I'm running VS2012 full time, I decided it was time to put these issues to bed and ensure I could use one click, or as near to one as possible, to deploy staging configurations of my applications as well as live ones.

VS2012 has tweaked publishing again; it now includes a much fuller publish model, with Web Deploy amongst the usual FTP, file system, etc. The Package / Publish process is just an extension of MSBuild, which means you can perform any process / action that you would normally do with MSBuild; this could mean performing minification / compression of CSS / JS, etc. Other people have written about this: http://sedodream.com/2011/02/25/HowToCompressCSSJavaScriptBeforePublishpackage.aspx

When you deploy, MSBuild performs two steps: first it gathers all your content into a temporary location, and then it copies it all to the final location, or builds a package if that's what you have configured.

What I needed to do was alter the process to pull some additional files into the temporary location so that they then get published to the final destination.

This is possible by extending the PipelineCollectFilesPhaseDependsOn property. We can simply add a build target to our project file and then hook it into the collect-files phase via that property.

Our Target:


<Target Name="CopyIOCFiles">
  <Message Text="Copy IOC Files" Importance="high" />
  <ItemGroup>
    <_CustomFiles Include="$(ProjectDir)\IOC\**\*" />
    <FilesForPackagingFromProject Include="%(_CustomFiles.Identity)">
      <DestinationRelativePath>bin\%(RecursiveDir)%(Filename)%(Extension)</DestinationRelativePath>
    </FilesForPackagingFromProject>
  </ItemGroup>
</Target>

It's worth noting here that my target could just contain the ItemGroup element; however, I like to know the target has run, so by sticking in a Message element we can see it ran by looking at our build output.

This target is then run in the correct phase via the following:


<PropertyGroup>
  <PipelineCollectFilesPhaseDependsOn>
    CopyIOCFiles;
    $(PipelineCollectFilesPhaseDependsOn);
  </PipelineCollectFilesPhaseDependsOn>
</PropertyGroup>

Hurrah! That's it, and actually a lot easier than I thought. When you combine this with web.config transformations on a per-configuration basis, you really can deploy one application in several configurations in one click.



Sunday, 10 June 2012

Shims Shim Shims

Tonight I started a small project and I thought I'd take the opportunity to explore the new Microsoft.QualityTools.Testing.Fakes library. In case you don't know, this library basically allows you to "easily" mock out those pesky sealed classes within the .NET Framework. I've only used it for System.IO tonight, but it does do more; I have a feeling, though, that System.IO will be used heavily with it.

What the library allows you to do is generate a shim version of each sealed class within a DLL. So what is a shim? MSDN at the moment says the following:
Shim types allow detouring of hard-coded dependencies on static or non-overridable methods.
What this means to you and me is that you get to create a wrapper type, ShimDirectoryInfo for example. You then set delegates for each method and property that you are going to use on the type and let these return your expected values. When your real code then goes to use the original type, DirectoryInfo in our example, it will automatically use the shimmed methods you have defined. What's clever about this is that you can not only pass your shims into code by getting the shim's generated instance type, you can also declare an AllInstances implementation which any code that creates your type will use. This is neat as you don't then have to modify existing code to take these types.
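As a rough sketch of the AllInstances idea, assuming the generated delegate for the DirectoryInfo.Exists getter follows the usual Fakes naming and is called ExistsGet:

using (ShimsContext.Create())
{
    // Every DirectoryInfo created anywhere in the code under test will now report
    // Exists = true; the first lambda parameter is the instance the call was made on.
    System.IO.Fakes.ShimDirectoryInfo.AllInstances.ExistsGet = (instance) => true;

    // ... exercise code that news up its own DirectoryInfo internally ...
}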

Here's a simple example that isn't playing with System.DateTime.Now ;) A lot of the examples on the net use DateTime.Now to demonstrate shims, but I wanted to show how I started using them for testing code that relies on System.IO.

I have a simple class, LocalStorage, with a constructor that takes a file path. All we do for now in this constructor is check for a null / empty path and whether the directory actually exists, throwing if it doesn't.

/// <summary>
/// Creates a new local storage class for accessing file and directory information
/// </summary>
/// <param name="rootPath">Base path to storage location</param>
public LocalStorage(string rootPath)
{
    if (String.IsNullOrWhiteSpace(rootPath))
    {
        throw new ArgumentNullException("rootPath");
    }

    if (!System.IO.Directory.Exists(rootPath))
    {
        throw new IOException("Path does not exist: " + rootPath);
    }

    _rootPath = new DirectoryInfo(rootPath);
}
Nothing too complicated there, but how would we test the directory not existing and the exception being thrown? Previously I would have done something like pass in a path that "should" never exist, however this isn't a particularly elegant way of doing it. So how would shims help?

Here's my new test:

[Fact]
public void NonExistantPathThrowsIOException()
{
    using (ShimsContext.Create())
    {
        // use fakes to say the folder doesn't exist without having to actually hit the operating system
        System.IO.Fakes.ShimDirectory.ExistsString = (path) => false;
        Assert.Throws<System.IO.IOException>(() => new LocalStorage("m:\\test"));
    }
}

To start using shims, after generating the shim types for an assembly we new up a ShimsContext with ShimsContext.Create(). Note how this needs to be in a using statement: it's disposable, and for good reason. Shims exist for the lifetime of the app domain, so if you forget to dispose of your context, any shims you have created would be used for all tests until the app domain was shut down. By using a using statement, and thus the Dispose method, you are creating a scope and context.

Now, within our context we get to set a delegate for whatever we want to fake. In this case we want to fake our call to Directory.Exists. As this is a static method we can set it globally. This is done by using the ExistsString delegate on the ShimDirectory class. We can use a lambda here to make this really short and sweet.

That's it! Now when we run our test our fake method is used whenever Directory.Exists is called; no more dependency on the real filesystem. Nice :)

Now you can take this quite far and start mocking out further things, GetFiles for example; I want to test my business logic, which trims out specific files that can't be handled by the default GetFiles method.

What you find with methods that have overloads is that when you go to set the delegate you have to ensure you set the correct one. GetFiles, for example, has the following delegate properties on ShimDirectoryInfo: GetFiles, GetFilesString and GetFilesStringSearchOption, one for each overload combination. It's worth checking the method signature to ensure you get the correct delegate, otherwise you'll end up chasing a bug which doesn't exist in the real code but in your test ;)
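For example, to fake the GetFiles(string searchPattern) overload on every instance, a sketch along these lines should do it (the paths returned are obviously made up):

using (ShimsContext.Create())
{
    // GetFilesString matches the GetFiles(string) overload: the delegate receives the
    // DirectoryInfo instance plus the search pattern, and returns whatever FileInfos we like.
    System.IO.Fakes.ShimDirectoryInfo.AllInstances.GetFilesString =
        (dir, pattern) => new[]
        {
            new System.IO.FileInfo(@"m:\test\keep-me.txt"),
            new System.IO.FileInfo(@"m:\test\trim-me.tmp")
        };

    // ... call the business logic that filters the results and assert on what survives ...
}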

Finally, not everything is implemented via shims; there is still stuff missing. I found that ShimFileInfo doesn't provide a way to fake file write times, but it does do names etc. It is a step in the right direction, though, when dealing with "stubborn" code.

Enjoy



Thursday, 10 May 2012

Portable Class Library Projects and your Build Server

Yesterday my build server started failing on a particular project, and after a quick look at the log I found that it simply wasn't building the solution. Peculiar, as I knew it built fine on several development machines. I dug into the logs and found the following:
  •  error MSB4019: The imported project "C:\Program Files (x86)\MSBuild\Microsoft\Portable\v4.0\Microsoft.Portable.CSharp.targets" was not found. Confirm that the path in the <Import> declaration is correct, and that the file exists on disk.
Oh dear. Now, I knew Portable Class Libraries was an add-on, so I went to the MSDN Visual Studio Gallery and got the installer so that I could chuck it onto the build server.

However, when attempting to install I got a great message saying I needed VS2010 with SP1 installed; obviously for a build server this wasn't going to happen. After a bit of digging on MSDN I found that the installer actually has a switch that, when used, just installs the targets for MSBuild.
To install the Portable Class Library Tools on a build machine without installing Visual Studio 2010, save the download file (PortableLibraryTools.exe) on your computer, and run the installation program from a Command Prompt window. Include the /buildmachine switch on the command line.

Simply run the installer with the /buildmachine switch and everything installs hunky dory.

PortableLibraryTools.exe  /buildmachine


To my great relief the build server started building again and things went back to normal.

Thursday, 12 April 2012

Using the SQL Server Export wizard to generate an Excel spreadsheet from an Access database

What a wordy title, go read it again, it's crazy but true.

This post covers generating an Excel spreadsheet from an Access database using a SQL Server tool. It's crazy but great and easy!

Welcome!
I am assuming you have SQL Server and the Import and Export Wizard installed; if you don't, go get them first before carrying on.

First up, open the Import and Export Data application. Note this has to be the x86 edition, so if you are running an x64 machine, which I think most devs are nowadays, ensure you choose the correct edition.
This will give you the welcome screen.

Source Screen
Click Next to get the ball rolling. This first screen is where we choose where to get the data from, in this case Access, so use the drop-down box to select Microsoft Access. Then provide the path to the database file; you can see my file path was actually on the network and it worked fine.

Destination Screen

Next up, the destination. Again, Excel is what we want, so highlight it in the list and provide a path to save to. You can also choose the version of Excel you want for the file format; lots of options, which is great.

Now, upon clicking Next here two things may happen: you might proceed to the data copy screen, or you might get an error. On my machine I got an error, however on others I did not. The error was: The 'Microsoft.ACE.OLEDB.12.0' provider is not registered on the local machine. (System.Data)

Copy or Query
If you don't have Office installed, going to http://www.microsoft.com/download/en/details.aspx?displaylang=en&id=13255 and downloading the 32-bit Access Database Engine will fix this; however, I already had Office 2010, so this wasn't the issue. If you remember, earlier I mentioned the difference between x86 and x64; well, it's back to haunt us. I had installed 64-bit Office (at the time I thought, why not, I have 16GB of RAM, something needs to attempt to use it...), and this meant that only a 64-bit provider was registered. I attempted to download the 32-bit Access Database Engine, but it won't install if you have 64-bit Office, which does make sense. So I rolled myself back to 32-bit Office, sad but true.

Anyhow, with all that sorted, you will be at the copy or query screen. If you simply want all your data out of your database in the form it is stored, you could use the copy table option. However, I wanted to pull data from several tables and output it in a specific format, so I chose the query option.

What a query
You then get a nice big text box to enter your query. You could be hardcore and enter it directly and make sure it parses, but I'm guessing that, like most people, you will have tested the query in SQL Server Management Studio and will simply paste it in, or use the browse button to point it at a saved query.

It is worth noting here that I believe there is a bug with this window: if I browsed to and loaded a file, only half of my query appeared in the window; however, if I copy-pasted, it worked fine.

The next window asks you to confirm the tables and views to copy; when you are using a custom query there isn't a lot more you can do on this screen, so click Next again.

The next screen, Run Package, gives you the option to run the export now or to save it as a package that can be reused later. However, you can only save a package on SQL Server Standard, Enterprise, Developer or Evaluation; if you have SQL Server Express, Web, or Workgroup, you can only run it immediately.

Success!
When you choose to run the package you get a progress window showing what's happening, what's left, etc., very similar to the SQL Server installation progress screens. When it's done you should have all green ticks and an Excel file with your Access data in it.

Cracking stuff. If you haven't before, you should take some time to look at the SQL Import and Export tool and DTS more fully; it allows you to import and export data from a vast array of sources into just as many again. I have used it to take old Access DBs and create SQL Server ones, and to manage data migrations; it's a tool that should be in every developer's belt. We don't need to continually write console or Windows Forms apps for managing data; we already have a tool that covers the majority of use cases, we just have to use it and write the queries!

I hope this helps, I know it saved me a ton of time :)