software development
Practical MVVM
Last Wednesday night, I attended a talk at RockNUG on MVVM by Joel Cochran. It’s the best explanation of the Model-View-ViewModel design pattern that I’ve seen so far. I found his talk particularly useful because he focused on the fundamentals of the design pattern instead of a specific framework (of which there are many).
Cochran’s talk was good for a second and completely unexpected reason: his use of Mercurial for his code demos. I’ve been to my share of conferences and user groups and seen a lot of demos, but before that talk, I’d never seen a speaker eliminate the inevitable typos and otherwise speed up his presentation that way. When there was code he wanted to show that exhibited an aspect of his Practical MVVM Manifesto, he simply grabbed a commit from his local Mercurial repository and updated the code in place. The next time I give a talk or do any demos, I hope I can make good use of that trick too.
.NET Reflector: No Longer Free, But Still Worth It
Those of us who registered with red-gate.com after they bought Reflector from its creator, Lutz Roeder, got an e-mail on February 2 saying the next version of Reflector would no longer be free. It’s the second time in my recent memory that a free and/or open source package for .NET became closed. The first one was NCover, which was probably the best option out there for determining how much of your codebase was covered by unit tests. Even at $658 for a license with a subscription, it may be a more cost-effective option than paying for the Visual Studio SKU that includes the team testing tools.
By contrast, the entry-level price for .NET Reflector is relatively low ($35). As a tool, I think it’s valuable enough that everyday .NET developers should consider spending their own money on it.
Introducing .NET Reflector
I gave a presentation on .NET Reflector at the January 12 meeting of RockNUG. I spent most of my time demonstrating the product and answering questions, so I had very few slides. Instead of simply posting them here and calling it a day, this blog post incorporates some of the Q & A that happened during the presentation.
What Is It?
The title slide in my presentation called .NET Reflector an “x-ray machine for (unobfuscated) .NET assemblies”. That little phrase actually understates what .NET Reflector can do. Decompiling .NET assemblies is only one of the three major things it does (and is probably the best-publicized of its capabilities). This tool also provides an excellent class browser and static analysis capabilities.
Why Is It Helpful?
In addition to letting developers see the code for third-party assemblies when they don’t have access to the source files, it can be quite a useful learning tool for seeing what the compiler generates from the source code we write ourselves.
Demos
One of the first features I demonstrated was the ability of .NET Reflector to decompile an assembly into multiple .NET languages. C# is the default, but VB.NET, Delphi, Oxygene, MC++, F#, and MSIL are the other target languages available out of the box. Using the add-in architecture of .NET Reflector, one developer added limited support for decompilation to PowerShell (download the PowerShellLanguage.zip file).
Around this point in the presentation, someone asked if you could cut-and-paste decompiled source into a file. Not only does that work, but with Denis Bauer’s Reflector FileDisassembler plug-in installed, you can decompile an entire assembly into its constituent source code files (though I suspect that Red Gate Software would prefer that you pay for .NET Reflector Pro to get this capability).
I was also able to demonstrate the much-overlooked static analysis capabilities of .NET Reflector. They enable developers to learn what code depends on particular methods, where they’re used, what code exposes them, and where they’re called. It turns out that there’s a free plug-in which extends this capability even further. The Dependency Structure Matrix (DSM) plug-in allows developers to generate and manipulate matrices for determining the level of complexity in a software architecture. Considering that a full-featured tool with DSM capability like NDepend costs hundreds of dollars per seat license, even a fraction of those features in a free .NET Reflector plug-in is well worth the time it takes to learn how to leverage it.
More Q & A
When I showed that .NET Reflector pulled in source code comments in its display of disassembled code, one member of the audience pointed out (correctly) that this was only possible because the XML file containing the comments was co-located with the assembly. When I tested the disassembly capability afterwards without the XML file present, the comments didn’t display.
There was also a question from the audience about how .NET Reflector compared with ILDASM (which ships with the .NET Framework). The short answer is that ILDASM is far more limited by comparison. It only decompiles to IL, it lacks the analysis capabilities of .NET Reflector, and most importantly it doesn’t have a plug-in architecture to enable expansion of what it can do.
Conclusion
My presentation on January 12 only scratched the surface of what .NET Reflector can do. I hope this post has added more depth, and piqued your curiosity to use the tool to improve your own development experience. You may find the following links helpful in leveraging .NET Reflector for your own development work:
- Red Gate (http://www.red-gate.com/products/dotnet-development/reflector/)
- Wikipedia (http://en.wikipedia.org/wiki/.NET_Reflector)
- Stackoverflow (http://stackoverflow.com/questions/408167/examples-of-using-net-reflector)
- Codeplex (http://reflectoraddins.codeplex.com/)
- Oxygene (http://en.wikipedia.org/wiki/Oxygene_(programming_language))
- C# in Depth (http://csharpindepth.com/Downloads.aspx)
- Jon Skeet (http://msmvps.com/blogs/jon_skeet/Default.aspx)
- Snippy Reflector Add-In (http://msmvps.com/blogs/jon_skeet/archive/2008/11/23/the-snippy-reflector-add-in.aspx)
Deleting TFS Tasks
I changed the state of a TFS task I was working on recently, only to discover the workflow wouldn’t let me return it to its prior state. Until today, I didn’t know it was possible to delete TFS tasks if you made a mistake in changing one, but some Googling revealed a blog post that explained how to delete tasks.
The direction of the slashes in the URL can point forward (/) instead of backward (\) and the witadmin.exe destroywi /Collection:<TFS url> /id:<Task id> command still works.
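For reference, a destroywi invocation looks something like this (the collection URL and work item id here are placeholders, not real ones):

```
witadmin destroywi /collection:http://tfsserver:8080/tfs/DefaultCollection /id:1234
```

Be careful with this command: destroywi permanently deletes the work item and its history, with no way to restore it.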
Filtering Heterogeneous Arrays in .NET
One of the bugs I was recently asked to fix for an application required me to determine whether or not to display one of the members of a list. This proved somewhat challenging since the lists in question were heterogeneous (two different subtypes of an abstract base class). It turned out that LINQ provides a nice solution to this sort of problem in the form of the OfType<T> method.
Given an IEnumerable collection with elements of multiple types, calling OfType<T> on the collection, where T is the desired type, returns a collection containing only the elements of type T. Before learning about OfType<T>, I’d been using the Cast<T> method. This was fine as long as all the collection elements were of the type T I wanted; the moment this wasn’t the case, my LINQ query threw an InvalidCastException. OfType<T> works much like the “as” operator in C#, in that it doesn’t complain if a list element isn’t of type T–it simply excludes it from the returned collection.
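A small, self-contained sketch of the difference (the Shape/Circle/Square hierarchy is made up for illustration; the real application types were different):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

abstract class Shape { }
class Circle : Shape { }
class Square : Shape { }

class Program
{
    static void Main()
    {
        var shapes = new List<Shape> { new Circle(), new Square(), new Circle() };

        // Cast<Circle>() would throw InvalidCastException when it hit the Square.
        // OfType<Circle>() silently skips any element that isn't a Circle.
        IEnumerable<Circle> circles = shapes.OfType<Circle>();

        Console.WriteLine(circles.Count()); // 2
    }
}
```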
PowerGUI and .NET Framework 4.0
On my current project, we use PowerShell scripts to automate our UI testing. We’ve been writing and running the scripts in the PowerGUI Script Editor, an excellent tool that’s also free. When we upgraded our application to run on version 4.0 of the .NET Framework from 3.5, we lost the ability to run PowerShell scripts in debug mode from PowerGUI.
The only work-around for this I’ve found (at least until a version of PowerGUI built on .NET 4.0 comes out), is a registry hack that forces all the .NET apps on the machine to use the latest version of the CLR. You can find more details in this user discussion at powergui.org, or this discussion on stackoverflow.com.
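If I recall the discussions correctly, the hack sets the OnlyUseLatestCLR value under the .NET Framework registry key. It's machine-wide, so use it with care; on a 64-bit machine it amounts to something like:

```
reg add HKLM\SOFTWARE\Microsoft\.NETFramework /v OnlyUseLatestCLR /t REG_DWORD /d 1 /f
reg add HKLM\SOFTWARE\Wow6432Node\Microsoft\.NETFramework /v OnlyUseLatestCLR /t REG_DWORD /d 1 /f
```

Setting the value back to 0 (or deleting it) undoes the change once a .NET 4.0 build of PowerGUI is available.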
New MSBuild 4.0 Features
My current assignment has me working with the application build again. MSBuild 4.0 added a number of new features that I’m only now getting to take advantage of. The coolest one so far is the extensible task factory. It lets you define custom MSBuild tasks right in the project file, instead of having to create a separate DLL for them. Andrew Arnott wrote a custom factory that lets you define custom MSBuild tasks using PowerShell. Because we’re using PowerShell for UI automation testing of our application, we’ll soon be able to integrate those tests into our build system.
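As a sketch of what an inline task looks like with the built-in code task factory (the task name and message here are my own, not from our build):

```xml
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" ToolsVersion="4.0">
  <!-- Define a custom task inline; no separate DLL required -->
  <UsingTask TaskName="SayHello" TaskFactory="CodeTaskFactory"
             AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll">
    <ParameterGroup>
      <Name ParameterType="System.String" Required="true" />
    </ParameterGroup>
    <Task>
      <Code Type="Fragment" Language="cs">
        Log.LogMessage("Hello, " + Name + "!");
      </Code>
    </Task>
  </UsingTask>

  <Target Name="Greet">
    <SayHello Name="RockNUG" />
  </Target>
</Project>
```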
Another feature I just learned about is conditional constructs. They provide functionality equivalent to a case statement you might see in other languages, and they’re much more flexible than the Condition attribute available on most MSBuild tasks.
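The construct in question is the Choose/When/Otherwise element set; a minimal sketch (the property names here are illustrative):

```xml
<Choose>
  <When Condition="'$(Configuration)' == 'Debug'">
    <PropertyGroup>
      <DefineConstants>DEBUG;TRACE</DefineConstants>
    </PropertyGroup>
  </When>
  <When Condition="'$(Configuration)' == 'Release'">
    <PropertyGroup>
      <Optimize>true</Optimize>
    </PropertyGroup>
  </When>
  <Otherwise>
    <!-- Fallback when no When condition matches -->
    <PropertyGroup>
      <DefineConstants>TRACE</DefineConstants>
    </PropertyGroup>
  </Otherwise>
</Choose>
```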
ScrollViewer+ItemsControl vs. ListView
One of my most recent tasks at work was determining the cause of slow performance in one part of an application and coming up with a fix (if possible). We tracked the source of the problem down to a use of ItemsControl inside a ScrollViewer. Because the ItemsControl instance was trying to display hundreds of complex items, it took a noticeably long time to load. This turns out to be a known issue, with a few possible solutions. Simply changing the ItemsPanelTemplate of the ItemsControl instance to contain a VirtualizingStackPanel didn’t fix our performance problem.
What did resolve our performance issue was replacing the ScrollViewer and ItemsControl combination with a ListView. The list of what we changed includes:
- Giving the ListView the same name as the ItemsControl.
- Giving the ListView the same ItemsSource as the ItemsControl.
- Updating the ItemsPanelTemplate of the ListView to use a VirtualizingStackPanel.
- Setting HorizontalScrollBarVisibility to "Disabled".
- Binding the Visibility property of the ListView to a Converter.
- Updating the ItemContainerStyle with a ListViewItem style that sets the HighlightBrushKey and ControlBrushKey to be transparent.
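Put together, the replacement looked roughly like this (the name, bindings, and converter key are placeholders for our actual ones):

```xml
<ListView x:Name="ItemsList"
          ItemsSource="{Binding Items}"
          Visibility="{Binding ShowItems, Converter={StaticResource BoolToVisibilityConverter}}"
          ScrollViewer.HorizontalScrollBarVisibility="Disabled">
  <!-- VirtualizingStackPanel only creates containers for visible items -->
  <ListView.ItemsPanel>
    <ItemsPanelTemplate>
      <VirtualizingStackPanel />
    </ItemsPanelTemplate>
  </ListView.ItemsPanel>
  <!-- Transparent brushes suppress the default selection highlight -->
  <ListView.ItemContainerStyle>
    <Style TargetType="ListViewItem">
      <Style.Resources>
        <SolidColorBrush x:Key="{x:Static SystemColors.HighlightBrushKey}" Color="Transparent" />
        <SolidColorBrush x:Key="{x:Static SystemColors.ControlBrushKey}" Color="Transparent" />
      </Style.Resources>
    </Style>
  </ListView.ItemContainerStyle>
</ListView>
```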
The changes we made reduced the load time from around 20 seconds down to less than 2 seconds for 400 items.
The tradeoff in moving to a ListView (with VirtualizingStackPanel) from ScrollViewer+ItemsControl is scrolling speed. Scrolling through 400 items does go more slowly, but it’s preferable to waiting as long as we did just to see the data.
My First PowerShell Cmdlet
We’ve been using PowerShell to write automated tests of the UI on my current project. One of the tasks I took on today was creating a custom cmdlet to enable us to select radio buttons.
I already had an existing assembly of cmdlets to work with, so I just added a new class (SelectRadioButton) to it. Next, I added references to System.Management.Automation and System.Windows.Automation. With these references in place, I could add this attribute to the class:
[Cmdlet(VerbsCommon.Select, "RadioButton", SupportsShouldProcess = true)]
The attribute determines the actual name of the cmdlet you'll use in scripts (Select-RadioButton). The cmdlet needs an instance of AutomationElement to operate on, so that's defined next:
[Parameter(Position = 0, Mandatory = true, HelpMessage = "Element containing a radio button control")]
[ValidateNotNull]
public AutomationElement Element { get; set;}
Finally, I adapted some of the logic for my override of the ProcessRecord from this article on using UI automation. The end result looks something like this:
protected override void ProcessRecord()
{
    try
    {
        if (Element.Current.ControlType.Equals(ControlType.RadioButton))
        {
            SelectionItemPattern pattern =
                Element.GetCurrentPattern(SelectionItemPattern.Pattern) as SelectionItemPattern;
            if (pattern != null)
            {
                pattern.Select();
            }
        }
        else
        {
            // Put something in here for handling something other than a RadioButton
        }
    }
    catch (Exception)
    {
        // You could put some logging here
        throw;
    }
}
When default settings attack
When you first install SQL Server 2008 Express, the TCP/IP protocol is disabled by default. Be sure the protocol is enabled (which requires restarting the service) before you try to run an application that depends on it, otherwise you could spend hours trying to figure out why your application won’t work. It looks like SQL Server 2008 R2 Developer behaves the same way.
I suggested this a while back to a co-worker who’d been struggling all day to figure out why an application wasn’t working, and it turned out to be the solution.
Going beyond files with ItemGroup
If you Google for information on the ItemGroup element of MSBuild, most of the top search results will discuss its use in dealing with files. The ItemGroup element is far more flexible than this, as I figured out today when making changes to an MSBuild script for creating databases.
My goal was to simplify the targets I was using to create local groups and add users to them. I started with an example on pages 51-52 of Inside the Microsoft Build Engine: Using MSBuild and Team Foundation Build, where the author uses an ItemGroup to create a list of 4 servers. Each server has custom metadata associated with it. In his PrintInfo target, he displays the data, overrides an element with new data, and even removes an element. Because MSBuild supports batching, you can declare a task and attributes once and still execute it multiple times. Here’s how I leveraged this capability:
- I created a target that stored the username of the logged-in user in a property.
- I created an ItemGroup. The metadata for each group element was the name of the local group I wanted to create.
- I wrote the Exec commands to execute on each member of the ItemGroup
<ItemGroup>
  <LocalGroup Include="Group1">
    <Name>localgroup1</Name>
  </LocalGroup>
  <LocalGroup Include="Group2">
    <Name>localgroup2</Name>
  </LocalGroup>
  <LocalGroup Include="Group3">
    <Name>localgroup3</Name>
  </LocalGroup>
</ItemGroup>

The Exec commands for deleting a local group if it exists, creating a local group, and adding the logged-in user to it, look like this:

<Exec Command="net localgroup %(LocalGroup.Name) /delete" IgnoreExitCode="true" />
<Exec Command="net localgroup %(LocalGroup.Name) /add" IgnoreExitCode="true" />
<Exec Command="net localgroup %(LocalGroup.Name) $(LoggedInUser) /add" />

The result is that these commands are executed once for each member of the ItemGroup. This implementation makes it a lot easier to add more local groups if necessary, or make other changes to the target.
More on migrating partially-trusted managed assemblies to .NET 4
Some additional searching on changes to code access security revealed a very helpful article on migrating partially-trusted assemblies. What I posted yesterday about preserving the previous behavior is found a little over halfway through the article, in the Level1 and Level2 section.
One thing this new article makes clear is that SecurityRuleSet.Level1 should be used only as a temporary measure until code can be updated to support the new security model.
Upgrading .NET assemblies from .NET 3.5 to .NET 4.0
Code access security is one area that changed quite significantly between .NET 3.5 and .NET 4.0. An assembly that allowed partially-trusted callers under .NET 3.5 may throw exceptions when called after being upgraded to .NET 4.0. To make such assemblies continue to function after the upgrade, AssemblyInfo.cs needs to change from this:
[assembly: AllowPartiallyTrustedCallers]

to this:

[assembly: AllowPartiallyTrustedCallers]
[assembly: SecurityRules(SecurityRuleSet.Level1)]

Once this change has been made, the assembly will work under the same code access security rules that applied prior to .NET 4.0.
When 3rd-party dependencies attack
Lately, I’ve been making significant use of the ExecuteDDL task from the MSBuild Community Tasks project in one of my MSBuild scripts at work. Today, someone on the development team got the following error when they ran the script:
"Could not load file or assembly 'Microsoft.SqlServer.ConnectionInfo, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91'"

It turned out that the ExecuteDDL task has a dependency on a specific version of Microsoft.SqlServer.ConnectionInfo deployed by the installation of SQL Server 2005 Management Tools. Without those tools on your machine, and without an automatic redirect in .NET to the latest version of the assembly, the error above results. The fix for it is to add the following XML to the "assemblyBinding" tag in MSBuild.exe.config (in whichever .NET Framework version you're using):

<dependentAssembly>
  <assemblyIdentity name="Microsoft.SqlServer.ConnectionInfo" publicKeyToken="89845dcd8080cc91" culture="neutral" />
  <bindingRedirect oldVersion="0.0.0.0-9.9.9.9" newVersion="10.0.0.0" />
</dependentAssembly>

Thanks to Justin Burtch for finding and fixing this bug. I hope the MSBuild task gets updated to handle this somehow.
Continuous Integration Enters the Cloud
I came across this blog post in Google Reader and thought I’d share it. The idea of being able to outsource the care and feeding of a continuous integration system to someone else is a very interesting one. Having implemented and maintained such systems (which I’ve blogged about in the past), I know it can be a lot of work (though using a product like TeamCity lightens the load considerably compared with CruiseControl.NET). Stelligent isn’t the first company to come up with the idea of CI in the cloud, but they may be the first using all free/open source tools to implement it.
I’ve read Paul Duvall’s book on continuous integration and highly recommend it to anyone who works with CI systems on a regular basis. If anyone can make a service like this successful, Mr. Duvall can.
Set-ExecutionPolicy RemoteSigned
When you first get started with PowerShell, don’t forget to run ‘Set-ExecutionPolicy RemoteSigned’ from the PowerShell prompt. If you try to run a script without doing that first, expect to see a message like the following:
File <path to file> cannot be loaded because execution of scripts is disabled on this system. Please see "get-help about_signing" for more details.

The default execution policy for PowerShell is "Restricted" (commands only, not scripts). The other execution policy options (in decreasing order of strictness) are:
- AllSigned
- RemoteSigned
- Unrestricted
Can't launch Cassini outside Visual Studio? This may help ...
I’d been trying to launch the Cassini web server from a PowerShell script for quite a while, but kept getting an error when I tried to display the configured website in a browser. When I opened up a debugger, it revealed a FileNotFoundException with the following details:
"Could not load file or assembly 'WebDev.WebHost, Version=8.0.0.0, Culture=neutral, PublicKeyToken=...' or one of its dependencies..."

Since the WebDev.WebHost.dll was present in the correct .NET Framework directory, the FileNotFoundException was especially puzzling. Fortunately, one of my colleagues figured out what the issue was: WebDev.WebHost.dll wasn't in the GAC. Once the file was added to the GAC, I was able to launch Cassini and display the website with no problems.
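The fix itself is a one-liner with the SDK's gacutil tool, run from a Visual Studio command prompt (the path below is illustrative; use wherever WebDev.WebHost.dll lives on your machine):

```
gacutil /i "C:\Program Files\Common Files\Microsoft Shared\DevServer\9.0\WebDev.WebHost.dll"
```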
Unit testing strong-named assemblies in .NET
It’s been a couple of years since I first learned about the InternalsVisibleTo attribute. It took until this afternoon to discover a problem with it. This issue only occurs when you attempt to unit test internal classes of signed assemblies with an unsigned test assembly. If you attempt to compile a Visual Studio solution in this case, the compiler will return the following complaint (among others):
"Strong-name signed assemblies must specify a public key in their InternalsVisibleTo declarations."

Thankfully, this blog post gives a great walk-through of how to get this working. The instructions in brief:
- Sign your test assembly.
- Extract the public key.
- Update your InternalsVisibleTo argument to include the public key.
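The sn.exe commands for the middle step look something like this (the key file names are mine, and the public key in the attribute is truncated for illustration):

```
rem Create a key pair and use it to sign the test assembly
rem (via project properties or the AssemblyKeyFile attribute)
sn -k TestKey.snk

rem Extract the public key from the key pair, then display it
sn -p TestKey.snk TestKey.pub
sn -tp TestKey.pub
```

The full public key (not just the token) then goes into the attribute in the assembly under test, e.g. [assembly: InternalsVisibleTo("MyProject.Tests, PublicKey=0024000004800000...")].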
A .NET Client for REST Interface to Virtuoso
For my current project, I’ve been doing a lot of work related to the Semantic Web. This has meant figuring out how to write SPARQL queries in order to retrieve data we can use for testing our application. After figuring out how to do this manually (we used this SPARQL endpoint provided by OpenLink Software), it was time to automate the process. The Virtuoso product has a REST service interface, but the only example I found here for interacting with it used curl. Fortunately, some googling revealed a really nice resource in the Yahoo Developer Network with some great examples.
I’ve put together a small solution in Visual Studio 2008 with a console application (VirtuosoPost) which executes queries against http://dbpedia.org/fct/ and returns the query results as XML. It’s definitely not fancy, but it works. There’s plenty of room for improvement, and I’ll make updates available here. The solution does include all the source, so any of you out there who are interested in taking the code in an entirely different direction are welcome to do so.
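The core of the console application is just an HTTP POST of the query. This is a simplified sketch rather than the actual VirtuosoPost source, and the endpoint and query are examples:

```csharp
using System;
using System.IO;
using System.Net;
using System.Text;

class SparqlClient
{
    // POST a SPARQL query to an endpoint and return the raw XML results
    static string RunQuery(string endpoint, string query)
    {
        var request = (HttpWebRequest)WebRequest.Create(endpoint);
        request.Method = "POST";
        request.ContentType = "application/x-www-form-urlencoded";
        request.Accept = "application/sparql-results+xml";

        byte[] body = Encoding.UTF8.GetBytes("query=" + Uri.EscapeDataString(query));
        request.ContentLength = body.Length;
        using (Stream stream = request.GetRequestStream())
        {
            stream.Write(body, 0, body.Length);
        }

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            return reader.ReadToEnd();
        }
    }

    static void Main()
    {
        string xml = RunQuery("http://dbpedia.org/sparql",
            "SELECT ?s WHERE { ?s a <http://xmlns.com/foaf/0.1/Person> } LIMIT 5");
        Console.WriteLine(xml);
    }
}
```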
Adventures in SPARQL
If this blog post seems different than usual, it’s because I’m actually using it to get tech support via Twitter for an issue I’m having. One of my tasks for my current project has me generating data for use in a test database. DBPedia is the data source, and I’ve been running SPARQL queries to retrieve RDF/XML-formatted data against their Virtuoso endpoint. For some reason though, the resulting data doesn’t validate.
For example, the following query:
PREFIX owl: <http://www.w3.org/2002/07/owl#>
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
PREFIX foaf: <http://xmlns.com/foaf/0.1/>
PREFIX dc: <http://purl.org/dc/elements/1.1/>
PREFIX : <http://dbpedia.org/resource/>
PREFIX dbpedia2: <http://dbpedia.org/property/>
PREFIX dbpedia: <http://dbpedia.org/>
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
SELECT ?property ?hasValue ?isValueOf
WHERE {
  { <http://dbpedia.org/resource/Bank> ?property ?hasValue FILTER (LANG(?hasValue) = 'en') . }
  UNION
  { ?isValueOf ?property <http://dbpedia.org/resource/Bank> }
}

generates the RDF/XML output here. If I try to parse the file with an RDF validator (like this one, for example), validation fails. Removing the attributes from the output takes care of the validation issues, but what I'm not sure of is why the node ids are added in the first place.