Friday, May 6, 2011

Fail Loop


This gem is part of a C# code example that we got from a vendor recently. The exact same pattern appears multiple times throughout the example.

It stands out because it does manage to work correctly...


using (FileStream rdr = new FileStream(contentInfo.FullName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
    index = 1;
    while (true)
    {
        long offset = (index - 1) * chunkSize;
        long remaining = contentInfo.Length - offset;
        if (remaining < 1) break; // Stop processing, at the end of the file

        // stuff that doesn't ever touch the index variable 

        index++;
    }
}
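
For contrast, here is a minimal sketch of the same traversal written as a plain for loop (reusing the contentInfo and chunkSize variables from the snippet above):

// Same chunked read, minus the while(true)/break contortions and the bookkeeping index.
for (long offset = 0; offset < contentInfo.Length; offset += chunkSize)
{
    long remaining = contentInfo.Length - offset;
    // stuff that processes the chunk at 'offset' (up to 'remaining' bytes)
}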

Friday, March 11, 2011

Gasoline Boycotts - How much do big oil companies care?

Recently I've gotten a rash of Facebook event invites, emails, and other general online suggestions that I should participate in various gas boycotts. The proposal is that we all set up a single day where no one in the country buys any gasoline at all.

Sadly, there is no single organizer for these protests, so there are literally hundreds of different groups advocating boycotts on different dates. But aside from the logistical challenge, I'm still confused as to exactly how people think this kind of boycott would achieve anything.

These groups either claim to want to "send a message" to the "big oil companies", or they claim that this will "hit them in their bottom-line". These groups are run by idiots.

How does not buying gas on one day matter to the "big oil companies"?

Have you actually thought about this at all? It isn't like I buy my gas straight from big oil... sure, the local gas station's signs have BP logos all over them, but BP doesn't actually own and operate those individual gas stations. It isn't like the money flows from my debit card straight into some big oil company's bank account.

Sure, the gas station owners would notice, and they might even be slightly inconvenienced. But gas stations aren't the ones that set your gasoline prices. And gas stations don't pay their big oil suppliers on a daily basis either; they tend to settle their gasoline bills monthly or quarterly.

It really doesn't matter to the big oil companies which specific day any particular individual chooses to pump gas into their little car.

And what message are you planning to send with this boycott? It isn't like people are advocating a day of not driving, so just as much gasoline would be used up that day anyway, boycott or not. The small fraction of people who would normally have bought their gas on boycott day would just fill up the tank a day earlier or later, but they'd buy exactly the same amount as they would have had there been no boycott at all.

Unless you actually reduce the usage of gasoline, it doesn't matter exactly when you buy the gas. But even if you did manage to get everyone in the U.S. to stop using gasoline for one day, the best you could achieve would be a 0.27% (one day out of 365) drop in oil company profits, and that only if you make the false assumption that their profits come entirely from gasoline sales.

Let's put that in perspective...

Let's assume that we had boycotted, successfully, for one full day last year. Let's further assume that during the boycott we also didn't use any gas. And finally, let's make the (blatantly false) assumption that all oil company profits come from U.S. gasoline sales.

How much would it matter to big oil?

Well, using the simplest math possible ($30.5 billion ÷ 365 days ≈ $84 million), such a boycott in 2010 would have reduced Exxon Mobil's bottom line by around 84 million dollars. So yeah, Exxon would have only made about 30.42 billion in profit, instead of the full 30.5 billion that they actually reported last year.

I'm sure big oil is just terrified of those gasoline boycotts and is slaving away right now trying to find some way to appease those disgruntled U.S. motorists.

Saturday, February 26, 2011

Review: TuneUp - Organizing My Music Library

If you are like me, you have a massive music collection with tens of thousands of amazing songs; all of which you purchased completely legally (of course). If your collection is like mine though, it is also an organizational catasterfuck... duplicates, mis-tagged songs, missing artwork, random file names, etc.

So, if you find yourself facing the daunting task of organizing a massive collection of music, then you should totally buy this shit from TuneUp Media. You managed to steal... er, buy... over 100 GB of music, so the least you can do is cough up $20-$30 so you can tag and organize your collection.

TuneUp is an add-on for Windows Media Player and iTunes (same app, works with either/both).

I had no luck getting it to work with Windows Media Player reliably, but I HATE WMP, so... no loss there really. As an add-on for iTunes though, it works really well.

I really don't like iTunes either, at least not as a daily-use media player, but for the task of organizing a large library I have been reasonably impressed with its ability. There are better general tools, but when you add TuneUp to iTunes it goes from being merely a capable library manager to being outright amazing!

To perform my tuning, I used this procedure:

  1. Clear everything from the iTunes library
  2. Set TuneUp to write a message to the comments of the songs it tags (so you can see in the library which files you've updated).
  3. Modify iTunes library details view to display the comments column 
  4. Copy all of your busted music collection into the "MyMusic" folder in Windows
  5. Drag 5 folders from MyMusic (in Windows Explorer) and drop them into the iTunes library window
  6. Select the songs in iTunes, right-click, and choose "get info"; check the box for "comments" and save the changes to remove any existing comments from the songs.
  7. Drag the files from the iTunes library and drop them into TuneUp's "clean" window
  8. Let TuneUp do its thing and locate the info for your songs, then save all the appropriate suggestions from TuneUp (this updates the tags, and syncs the changes back to iTunes)
  9. Make any manual changes you need using iTunes media library (it actually has decent mass-tagging features)
  10. Select the newly fixed files in iTunes, right-click, and choose "consolidate files". This copies the files to the iTunes media folder.
  11. Copy the fixed-up folders and files (in windows explorer) from the iTunes media folder to an external drive or some other destination folder where you want to keep the polished-up collection.    
  12. Select all files in the iTunes library and delete them; choose send to recycle bin. This gets iTunes ready for the next batch of files you want to sort out. 
  13. Repeat from step 5 for the remainder of your collection.

TuneUp isn't perfect, but it has access to a very nice range of online music databases, and it also performs a playback analysis of your music, so it doesn't rely just on the existing tags and file names to match the files to the online databases. Because it listens to the files in addition to using the existing tags and filenames, it is very good at tracking down the right CD info for you. But sometimes, you'll still need to manually manage parts of your collection. The nice thing about TuneUp is that it doesn't automatically write changes; it tells you what it came up with, then lets you decide if you want to save the changes back to the files.

TuneUp does other stuff too. It has features I personally don't care about, such as locating concerts, suggesting similar music to what you are listening to, and even the obligatory social networking stuff. But for organizing a media library, it has more than earned the $30 I spent on the full license. The fact that it does other stuff might matter to you though, especially if you are debating the annual license ($20) vs. the full unlimited use license ($30).
 

Friday, August 20, 2010

Microsoft Live - Why I don't use it

Microsoft has been busy doing all kinds of neat stuff with their Live services. Mostly, though, Live has always been an all-around disappointment. But recently, they've gotten competitive with Google Docs. I suppose that's because Google Docs is a very serious threat to the MS Office empire.

The new SkyDrive and Online Office 2010 stuff is fantastic. In many respects it kicks Google Docs in the teeth. Google may be the king of the cloud, but Microsoft does pretty UIs second only to Apple, while Google's UIs tend to suck balls. Normally Google's minimalist approach works well, but not with stuff like this.

So it is possible that Microsoft could eat Google's face with Live if they wanted to.

But they aren't eating Google's face. Why? Well, because no one wants to use it... and probably for the same reasons I don't use Live myself. It basically boils down to just two problems:

  1. The Advertising. Look, I know Microsoft wants to make ad revenue. That's what Google's doing after all, and it's working. But come on! Live uses an absurd amount of screen space for their ads. The ads are so in-your-face that you can't help but want to vomit every time you open the site.   
           
  2. The advertising. Did I mention how distracting and annoying the advertising is? The entire Live setup is designed to make you switch from one page to another every time you try to do anything (unlike Google, which lets you do a lot from one intuitive page). This is transparently a gimmick to let them show you a new ad on each page... not because it makes any sense from a user interface perspective.
      
It's sad really... Microsoft has some really awesome cloud services, and they always have. But between their tarnished reputation, their failure to market their products, and their knack for finding some super annoying way to shoot themselves in the face, it just seems that no one really cares about Live. Unless someone over at MS gets a clue soon, I doubt anyone ever will either.

Pity...

I expect that with slow adoption, Microsoft may do what they've done with everything else that they've had trouble selling recently... find some way to tie it to Xbox Live, which is about the only Live-related service anyone has ever cared about. Maybe when they start giving out achievements and unlocks when you level up your spreadsheet, someone might be entertained enough to look past the rest of the crap.

Friday, July 9, 2010

MEF and MVC - Limitations and workarounds for partial trust environments

A while back I wrote about using MEF in MVC environments with extensions provided by Hammet for the Nerd Dinner MEF sample application. Those extensions deal with dynamic discovery of parts based on MVC conventions (instead of attributes), as well as per-request composition containers. The extensions work great, after a few modifications that I talked about in the last post... but in partial trust environments they blow up in your face!

BOOM!

I spent hours and hours digging through the code, reading about CAS, trust policies, transparent code, and a whole mess of other junk that I'd really rather not have rattling around my skull. Long story short -- MEF isn't friendly with partially trusted ASP.NET environments.

Now, you could write your custom MEF code in a class library, flag the assembly with the APTCA attribute, sign it, and install it to the GAC if you want. That neatly gets around these limitations, but if you are running in partial trust you probably don't have the luxury of installing things to the GAC either.

The first major limitation is that you cannot access the parts collection within catalogs or containers. If you try it, you get an exception like this:

Attempt by method 'DynamicClass.lambda_method(System.Runtime.CompilerServices.Closure)' 
to access type 'System.ComponentModel.Composition.Hosting.ComposablePartCatalogCollection' failed.
  
  
The easiest way to reproduce the problem is to simply add the trust element to web.config (any level below Full runs as partial trust) like this:

<trust level="High"/>
  
  
Then add this to application startup in global.asax:

 // Building the catalog works; it's enumerating Parts that blows up under partial trust.
 var cat = new DirectoryCatalog(Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "bin"));
 var parts = cat.Parts.ToArray();
  
   
In the Nerd Dinner MEF sample, this limitation effectively kills the mechanisms that split up parts into per-request containers vs. application-wide containers.

If you've done any reading about MEF online, you've likely run across code for a FilteredCatalog class. This thing is so commonly cited on the net that it seems absurd that it wasn't built into MEF. But these partial trust limitations kill FilteredCatalog, which the Nerd Dinner MEF sample uses heavily.
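
For reference, the FilteredCatalog implementations floating around generally look something like this sketch (reconstructed from the common samples, not from any official MEF type); note that it is exactly the access to Parts that partial trust forbids:

using System;
using System.Linq;
using System.ComponentModel.Composition.Primitives;

public class FilteredCatalog : ComposablePartCatalog
{
    private readonly IQueryable<ComposablePartDefinition> _parts;

    public FilteredCatalog(ComposablePartCatalog inner, Func<ComposablePartDefinition, bool> predicate)
    {
        // Touching inner.Parts is the very access that fails in partial trust.
        _parts = inner.Parts.Where(predicate).AsQueryable();
    }

    public override IQueryable<ComposablePartDefinition> Parts
    {
        get { return _parts; }
    }
}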

The other major limitation is that you cannot use ReflectionModelServices, which is needed to create export/import definitions programmatically. This kills the Nerd Dinner MEF sample's auto-discovery of controllers.

Despite these limitations, you can still use MEF in medium trust, but only if you are careful to keep things simple and straightforward.

Honestly, I recommend that you just use Ninject or a similar IoC/DI framework until the next version of MEF or MVC (hopefully) fixes these issues.

In my case though, I really wanted to be able to support medium trust environments, and I'm too damned stubborn to give up on MEF that easily.

I'm OK with having to use the MEF attributes to decorate my controllers, so losing auto-discovery isn't much of a problem. Hammet's extensions are brilliant, but the auto-discovery mechanism is a lot of VERY complicated experimental code.

Now, the simplest thing you can do is just use a custom MVC ControllerFactory that instantiates a new MEF container on each request. That works well, and is trivially easy to implement:

public class MefControllerFactory : IControllerFactory
{
    public IController CreateController(RequestContext requestContext, string controllerName)
    {
        // Build a brand-new catalog and container for every single request.
        var catalog = new DirectoryCatalog(Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "bin"));
        var requestContainer = new CompositionContainer(catalog);
        var controller = requestContainer.GetExportedValueOrDefault<IController>(controllerName);

        if (controller == null) { throw new HttpException(404, "Not found"); }

        return controller;
    }

    // IControllerFactory also requires this; there's nothing to clean up here.
    public void ReleaseController(IController controller) { }
}
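
Wiring it up is a one-liner in Application_Start in Global.asax (the same registration call the fuller example later in this post uses):

protected void Application_Start()
{
    // Tell MVC to use our MEF-backed factory instead of the default one.
    ControllerBuilder.Current.SetControllerFactory(new MefControllerFactory());
}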

Sure, this is fine, but it sort of undermines a lot of the power of MEF. MEF's default behavior uses a singleton pattern to reuse parts that have already been instantiated, but this mechanism eliminates ALL reuse by recombobulating the entire container on each request. It also has an appreciable performance impact, since reflection has to rebuild the entire catalog each time too.

Another solution is to create an application-wide container, and keep the controllers from being reused by setting the PartCreationPolicy attribute to NonShared. That's a better solution, and simple to achieve too. It looks something like this:

public static class ContainerManager
{
    // Lazy<T> gives us thread-safe, once-only initialization; the container
    // itself is created with isThreadSafe: true since every request shares it.
    private static readonly Lazy<CompositionContainer> _container =
        new Lazy<CompositionContainer>(() =>
        {
            var catalog = new DirectoryCatalog(Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "bin"));
            return new CompositionContainer(catalog, true);
        });

    public static CompositionContainer ApplicationContainer
    {
        get { return _container.Value; }
    }
}

Then your controller factory just uses the application container from this static class. Very simple, and it lets you control reuse with MEF's standard attributes.
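
As a concrete sketch, a controller under this scheme might look like the following; the contract name "Home" is my assumption about how the factory maps MVC's controllerName string to an export:

// Hypothetical controller export. NonShared makes MEF build a fresh
// instance for every request instead of handing back a cached singleton.
[Export("Home", typeof(IController))]
[PartCreationPolicy(CreationPolicy.NonShared)]
public class HomeController : Controller
{
    public ActionResult Index()
    {
        return View();
    }
}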

I actually recommend the above approach, but it bothered me to mark controllers as NonShared. It isn't that controller instances cannot be reused, it's just that in MVC they can't be reused across multiple requests.

So I came up with a more ghetto solution that can sort-of mimic a FilteredCatalog even in medium trust. This allows for a pattern more similar to the Nerd Dinner MEF sample: you can have application scoped containers, plus smaller per-request containers just for the controllers.

It looks a little something like this:

First, create a class derived from HttpApplication so you can bootstrap the MEF containers and catalogs on application startup:

public class MefHttpApplication : HttpApplication
{
    public static ComposablePartCatalog RootCatalog { get; private set; }
    public static CompositionContainer ApplicationContainer { get; private set; }
    public static ComposablePartCatalog ControllerCatalog { get; private set; }

    protected virtual void Application_Start()
    {
        if (RootCatalog == null) { RootCatalog = CreateRootCatalog(); }
        if (ApplicationContainer == null)
        {
            // isThreadSafe: true -- this container is shared by every request.
            ApplicationContainer = new CompositionContainer(RootCatalog, true);
        }
        if (ControllerCatalog == null)
        {
            // Poor man's filtered catalog: keep only the concrete IController types.
            var controllerTypes = Assembly.GetExecutingAssembly().GetTypes()
                .Where(t => !t.IsAbstract && typeof(IController).IsAssignableFrom(t));
            ControllerCatalog = new TypeCatalog(controllerTypes);
        }
        ControllerBuilder.Current.SetControllerFactory(new MefControllerFactory());
    }

    protected virtual void Application_End()
    {
        if (ApplicationContainer != null){ApplicationContainer.Dispose();}
    }

    protected virtual ComposablePartCatalog CreateRootCatalog()
    {
        return new DirectoryCatalog(Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "bin"));
    }
}

On startup we create a master catalog of every part definition, and an application scoped container from that master catalog. But we also create a catalog containing just the controller parts, by using a bit of reflection to pull out the controller types and shove them into a TypeCatalog (which is built into MEF)... the poor man's filtered catalog!

Now just doctor up Global.asax to inherit from the MefHttpApplication class:

public class MvcApplication : MefHttpApplication
{
    public static void RegisterRoutes(RouteCollection routes)
    {
       //normal route stuff
    }

    protected override void Application_Start()
    {
        base.Application_Start();
        AreaRegistration.RegisterAllAreas();
        RegisterRoutes(RouteTable.Routes);
    }
}

And finally, we need our ControllerFactory:

public class MefControllerFactory : IControllerFactory
{
    public IController CreateController(RequestContext requestContext, string controllerName)
    {
        var requestContainer = GetRequestControllerContainer(requestContext.HttpContext.Items);
        var controller = requestContainer.GetExportedValueOrDefault<IController>(controllerName);

        if (controller == null){throw new HttpException(404, "Not found");}

        return controller;
    }

    public void ReleaseController(IController controller){/*nothing to do*/}

    public static CompositionContainer GetRequestControllerContainer(IDictionary contextItemsCollection)
    {
        if (contextItemsCollection == null) { throw new ArgumentNullException("contextItemsCollection"); }

        // One controller container per request, cached in HttpContext.Items.
        var container = (CompositionContainer)contextItemsCollection["MefRequestControllerContainer"];

        if (container == null)
        {
            // Compose from the controller-only catalog, with the application
            // container acting as an export provider for everything else.
            container = new CompositionContainer(MefHttpApplication.ControllerCatalog, false, MefHttpApplication.ApplicationContainer);
            contextItemsCollection["MefRequestControllerContainer"] = container;
        }
        return container;
    }
}

As you can see, the overall technique here is similar to the one used in the Nerd Dinner MEF sample. We have a static method that we can call to build a per-request container, and it stuffs that container into the request context in case it's needed again later. The key to the container itself is that it is built from our catalog of just the controller types, and it uses the application scoped MEF container as an export provider for any other parts the controllers might need to import.
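
To make the two scopes concrete, here is a hypothetical pair of parts (IDinnerRepository and friends are made-up names in the spirit of the Nerd Dinner sample). The controller resolves from the per-request container, while its repository import is satisfied through the application-wide container:

public interface IDinnerRepository
{
    // ...query and save methods...
}

// Satisfied from the application-scoped container (shared across requests).
[Export(typeof(IDinnerRepository))]
public class SqlDinnerRepository : IDinnerRepository { }

// Lives in the controller-only TypeCatalog and is composed per request.
[Export("Dinners", typeof(IController))]
public class DinnersController : Controller
{
    private readonly IDinnerRepository _repository;

    [ImportingConstructor]
    public DinnersController(IDinnerRepository repository)
    {
        _repository = repository;
    }
}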

In the long run, this is probably no better than just marking our controllers as NonShared and using an application-wide container, but the general concept of this technique can be applied to other situations besides dependency injection with controllers. While you can't truly filter catalogs and manipulate parts in partial trust, you can still use reflection to create specialized catalogs and achieve similar results... for the simpler cases anyway.