Monday, December 17, 2007

Graffiti Beta 1 - Review (part 1)

As some of you may have noticed, Reddnet.net got a face-lift last week.

I've migrated off of the BlogEngine code base (which I reviewed a while back) in favor of Telligent's new Graffiti CMS product. So far Graffiti is making me quite happy, even though it's only beta 1 and a few important features are still missing.

I thought I'd go ahead and do a review on it, but once I started writing I ended up with a lot to say. So I'll be splitting the review into several parts over the next week or two.

So for the first part let me just tell you a little about Graffiti and Telligent in general.

Graffiti - It's not a blog exactly...

 Graffiti CMS is the newest product from Telligent, the folks that brought us Community Server.

 CMS is an industry standard acronym for "Content Management System", but with Graffiti the "CMS" part of this product's name supposedly stands for "Content Made Simple" instead. The dual meaning is quite deliberate, and Telligent is promoting Graffiti as a light-weight Content Management System even though they are playing around with the acronym.

 Graffiti is somewhere between a simple HTML content platform and a traditional CMS system.

 I should say up-front that I'm very skeptical of anything that calls itself a web based CMS system. The web itself IS a CMS system by design, so the only real value anything can add on top of that is just gravy. CMS as a term is kinda hard to define. But generally CMS systems manifest by giving the site operator tools to organize, control, and search content once you've accumulated a metric ass-ton of it. Of course, any web platform has to provide this kind of functionality, so generally when you hear someone talk about CMS they are either stating the obvious, or just talking out of their ass.

 But even though CMS is a flimsy marketing word, there are a couple of things a good "CMS" will have to provide. The most important is a set of tools for creating high-quality content. That means a simple and effective online content editor, and usually also revision tracking and content review and approval (work-flow) features. On the other end, a CMS should provide some mechanisms for organizing content. Usually that means categorization, labeling of some sort, reporting, and searching functionality. All this is necessary to keep old content from being buried under a mountain of newer stuff.

 As I said though, any decent web platform has to do this. The only difference between a "CMS" and an online catalog or online discussion forum is the kind of content being dealt with. In Graffiti's case, the content being managed is HTML "posts" like articles, blog posts, and similar.
 
 In beta 1, Graffiti has a good start on the publishing end of things. It provides the essential tools to easily and quickly create content, and it presents that content in an effective way. It serves content over RSS feeds as well as HTML web pages, is highly optimized for search engines, and has a beautifully simple way of managing URLs automatically so they remain very human friendly.

 But the bulk of the heavy CMS features such as revisioning, review, and moderation are incomplete in beta 1. The searching, sorting, and categorization of content is present, but also not quite complete yet. There are a few rudimentary reports in beta 1, but presently these are limited mostly to summary traffic information... Basically beta 1's reporting amounts to a glorified "view counter", but the reports are very pretty.

 Fortunately for smaller organizations or casual bloggers, the heavy CMS stuff is unobtrusive, optional, and if used will add a lot of value without adding a lot of complexity.

 While Telligent is very careful to make sure everyone knows that Graffiti can do things other than blogs, the reality is that Graffiti's design is very much blog-inspired. The only real difference between Graffiti and most other blog kits is that Graffiti is a tad more polished, more flexible, and built to be extended in other directions. But the core is very bloggish.

 The light-weight CMS components will probably allow Graffiti a good bit of success in other markets, such as news and other "article" based sites, as well as in product/service support applications as a knowledge base. I fully expect to see Graffiti add simple forums and wiki functionality very soon after the initial RTM release. If not, I expect the developer community will quickly move to supply them as add-ons.

 But the largest market for Graffiti will certainly be the bloggers. Beta 1 lacks some essential blog related functionality though. The biggest gap for me is support for referral services and trackbacks. But those are promised for the final release, and it already has rock solid posting and RSS support.
About Telligent:
 I'm a huge fan of Telligent and I've been closely following their work since the company first formed a few years ago. If you were to ask me where I'd most likely seek employment if I were to relocate out of South Carolina, then I'd give you a list of companies with Telligent's name at the top. They are a great group of very talented people and they make some of the best code on the planet!

 On the flip side though, I've never been that big a fan of Telligent's flagship product, Community Server. CS is just too big, heavy, and purpose built for my tastes. CS is most certainly the best large scale forums and blog hosting platform on the market. CS does what it does very well, but it also sticks to its niche pretty tightly and is hard to adapt, especially in smaller environments.

 As a developer, I've always found extending and customizing CS to be... well... less than fun. It is quite painful and difficult to work with, and requires a strong grasp of both .net and the CS product's internal architecture. CS has gotten much easier to develop against over the years, but during that time it has also grown into a monstrous beast with about a dozen optional enterprise level add-ons. So while developing for CS may be easier than in the past, there is a lot more to develop against too. CS is so big that you could almost make an entire career out of just developing CS sites... kinda like SharePoint, which is another product I'm not all that fond of --and for many of the same reasons.

Sunday, December 16, 2007

The Waffle House Code Engine

I spend a lot of time at my local Waffle House writing code. I know, not the most up-scale choice, but unfortunately a necessary one considering the lack of other good options these days.

For those unfamiliar with it, Waffle House is a very large chain of greasy-spoon diners. It is nearly a national chain; a few areas haven't been graced with a Waffle House yet, but most places have something similar.

Waffle houses are very small restaurants, seating around 25 people comfortably. They are usually located along major interstates or within major cities. They are also notoriously dirty places, with generally crappy food guaranteed to cause heart disease nearly instantly. The regular clients of such places range from retired old men to drunken motards, with a lot of not much in-between.

On the plus side though, waffle houses are cheap places to eat (more or less), and they serve great coffee. For their employees, the house pays weekly, and in cash too. This makes it the ideal job for drug-addicts, escaped convicts, illegal immigrants, and anyone whose idea of a "savings" account includes a mattress and some coupon books.

It wasn't always this way for me.

I used to hang out in higher-class establishments like IHOP or Denny's. Hey! I didn't say "high" class, just higher class than the awful waffle. But I'm a smoker, and I love sitting around with my laptop writing code while drinking coffee and smoking at 3am, and for that there just aren't many classy establishments to choose from.

It just works for me.

I've tried this at home, but all that mess about having to get up and make coffee, pour it myself when my cup runs empty... ugh... and I don't smoke in the house, so I have to actually go out to the garage when I want a smoke. Also, at home I have all my other stuff handy, like the TV, a fast internet connection, and a lot of neglected chores that need doing.

So it's easy to get distracted.

At the office, things are better. But there I have co-workers that need working with, my end-users have questions to ask, bugs to report, and just general problems that need solutions. Plus there is my boss who needs status updates and has new ideas and features to propose and some other projects to discuss. Then there are meetings and the occasional talking to the customer too.

In short:

if(codeAtHome == null && codeAtWork < optimal)
{
    DoWorkAtWaffleHouse();
}

So while I get code done at the office, and less often even at home, I still do my best work after-hours in places where there are people getting paid to look after me. Plus the staff makes good entertainment when you need to take a break from the screen for a bit.

But the anti-smoking movement is on its way, and I'm sure this arrangement won't last much longer; every year there is more and more political noise in the direction of banning smoking in all public places. At this point it may just be a matter of time. So once I either quit smoking (which I'm planning to do this year) or they finally ban it, the only real thing that changes is that I'll have more options for where to hang out.


Wednesday, December 12, 2007

Pack'n

Update: 6 months after buying the XD, I have replaced it with a Walther PPS 9mm. Find out why in my follow-up post.

I was raised around guns. As a kid I lived on a small hobby farm in a rural part of South Carolina, and guns were just part of the scenery. As young as 6 or 7, I was using small rifles and shotguns. Not that I was much of a hunter or anything. Mostly the guns were for defense.

We raised cattle, and packs of wild dogs were still common in the area, so we carried weapons of some kind almost all the time in the woods or pasture. Usually we carried knives, BB guns, pellet guns, or sometimes just big sticks. But if a pack of dogs had been sighted, suspicious strangers had been reported in the area, or a violent crime had occurred with the assailant still at large, we'd switch up to shotguns and rifles.

At least twice, having a gun on me in the woods saved my life. Once from a group of dogs and once from a very upset bull.

I have a healthy respect for the 2nd amendment and I am comfortable with guns, but I've not bothered to keep my own guns as an adult. I had several reasons including the high cost of weapons, but mostly it was just because I usually had roommates who owned plenty of guns. I just never felt the need to buy my own.

I'm not the kind to trust the police and government to protect me from danger. Police are great 10 minutes after the shit hits the fan, but they aren't likely to be handy when you are actually being assaulted, robbed, mugged, car-jacked, etc. And governments are just as likely to be doing the assaulting as protecting anyone from it. Likewise though, I do have a good grasp of the actual rarity of random personal assaults. Even in the most dangerous of urban areas, most assaults are committed by people you already know and have pissed off. So I just make a habit not to know too many people likely to assault me, and avoid pissing off the rest so much. This greatly increases my odds.

Still though, I have a habit of being out late at night and hanging around slightly dangerous places. I also own my own house now and live alone most of the time, so there aren't roommates with guns around anymore. And that is why I finally decided to buy a handgun, learn to shoot reasonably well, and get my concealed-carry permit. I have always carried a good knife on me, but I have no doubt that a knife would do little for me against a real threat.

Time to upgrade.

One thing that always bothered me about civilian gun owners is that they tend to get carried away. I don't know exactly what it is that makes an ordinary person become a gun-nut, but it does seem to be a very simple conversion for most people. They decide they want to buy a gun, and two weeks later they are at the gun-show picking up their 8th pistol and 3rd rifle and have most of their credit cards maxed out on accessories already. They'll go on for hours on end debating the merits of this vs. that weapon, talking technical details about the performance characteristics of different kinds of ammo, and worst of all --they spend an enormous amount of money buying weapons, accessories, and gear that they will likely never use other than recreationally.

I'm not a person who has a need to be a gun-collector, and I've never wanted to be a suburban gun-nut. But in becoming a gun owner, I see more clearly just how very easy it is to fall into becoming one. Guns are not simple, especially handguns. They come in an amazing array of sizes, shapes and calibers. Just deciding what to buy requires a significant investment in time and learning. And no handgun is good for everything. So it becomes very easy to have a conversation like this with yourself:
"I'll get the 5" 9mm semi-automatic for the shooting range. It is very accurate, holds a lot of ammo, and won't beat the shit out of me. It'd have similar enough recoil to my concealment weapon to make it good for practice."

"Then I can get the 6" .22 semi-automatic for the range because ammo is dirt cheap for those, almost no recoil so you can fire it all day long, plus they are just plain fun."

"But I still need a concealment weapon. For that I'd like to get the 3" sub-compact semi-auto .45 with double stack magazine. Big bullets for best chance to put down an attacker in one shot, double stacked for more bullets, and still short enough to fit in a concealment holster... more-or-less."

"But in the summer when I'm wearing thinner clothes, it would be easier to conceal that tiny little 9mm with the single stack magazine. It is really tiny, but with smaller bullets it holds as many shots as the double stack .45 and is a lot lighter. Of course, it will kick like a mule and be less accurate, so it's not that good at the range."

"OK, maybe I should go with a longer weapon. A 4" barrel will help with accuracy, even though it's harder to conceal, but that's OK I can go with a smaller caliber instead".

"Crap, what about brand? This one has a manual safety, which I like, but it is double action on that first shot, which always throws off my aim. Maybe I should get the one with the trigger and grip safeties instead... I don't like that as much, but it has a single action first pull... which would be better in a crisis."

"What about a revolver? They are more reliable and a little simpler to use."

"Now, should I get all metal, or go with a polymer frame?"

"Fuck-it, I'll just buy ALL of them!"
  
So I can see how collectors and gun-nuts happen.

For the record, I decided on a 3" Sub-Compact XD .40 Caliber from Springfield Armory. It's a short and squat weapon, but still a tad on the thick and heavy side. For most people it probably wouldn't matter, but I'm so thin that concealing a weapon of any kind is a tad tricky. But this one conceals reasonably well when I wear a coat or heavy shirt, and most of the time it'll probably just ride in my laptop bag anyway (I rarely go anywhere without that at my side).

I stubbornly refuse to go crazy and become a total gun nut and collect a personal arsenal. But also, I do live in a two story house, so I'll probably pick up a second handgun for up-stairs and eventually a slimmer 9mm carry weapon for those times when the XD is just a tad too big or when I'll be carrying for longer periods of time. Add that to a 12 gauge shotgun, the ultimate home-defense weapon, and that should be about all I'll need. That's four weapons, but still... most of my friends own twice that many or more...

Shit... OK, I'll admit it... I'm now officially a gun-nut. Damnit! But I promise not to get too carried away.


Monday, December 10, 2007

I'm not your prozac

Caress has started blogging her experience with AVM over at "I'm not your prozac". She wrote a lot of those posts while she was going through her procedures, but since I hadn't gotten off my lazy ass and put up her blog site (and she was a tad too busy to do it herself), she didn't get them online until more recently. Most of her posts are back-dated to when she actually wrote them.

I've avoided blogging about the experience mostly because I was waiting on her to blog about it first, and also because AVM is too damned scary for me to blog well.


Tuesday, November 20, 2007

Akamai MSDN Download Manager sucks!

 Richard's Braindump: Problems with the new MSDN Download Manager

I agree with Richard, the new MSDN download manager is worse than horrid. I've run into every one of the problems reported, but these in particular really chap my ass:
  • Close IE and the download manager closes. It asks you if you really want to close it, but it doesn't matter what you say... it will close anyway. When you re-open it, your download will be corrupt and you'll have to start over again from the beginning.
      
  • It'll just randomly tell you it can't download because of persistent network problems... and you can't resume because your file is corrupt.
     
  • If you have pop-up blocking on in IE 7 (which is the default) you might never get the download manager installed. It pops up a new browser window, which IE blocks, then the main page refreshes so fast you never see the notification that there was a blocked pop-up.
I have additional complaints too.
  • Adding additional files to the download manager sometimes (and by "sometimes" I mean "nearly always") closes the download manager... which as we discussed already, corrupts any files it was already downloading.
     
  • Manually pausing a download also usually corrupts it.
      
  • When it asks if you want to start the download all over again, and you say yes, it usually can't... so you have to remove the file from the manager, go back to the MSDN page then re-click the link.... which will close your download manger and corrupt the rest of your files!
And even if they fix these major technical problems, there are other issues I have with it:
  • Files are either downloading or paused. There isn't a way to put them in any particular order in the queue and download them sequentially. So if you want to pull in 5 files but need 1 file immediately and the rest whenever, you have to pause all the downloads except that one, wait for it to finish, then manually resume the others.
     
  • There doesn't appear to be any mechanism in place to let you throttle how much bandwidth it uses.
     
  • It allocates the disk space before the download starts. I HATE that. If I only have 1 gig downloaded so far, it should not be taking up 4 gig on my hard drive. When I'm low on disk space, I can't start the download and have it coming down while I go and clean up some drive space; I have to clean up the space before I can start downloading. Minor issue, but still annoying.
Seriously, this thing is a total piece of shit... Why can't you just give me a damned link?

I have a download manager that works really well. I can control what files come down, in what order, how much bandwidth to use, and how many files to work on at a time (I use ReGet, for those of you that may be curious).

Come on, just give me a regular old hyperlink or put it up on an FTP or something.

Considering that an MSDN Premium Team Suite subscription costs $10,000 up-front and around $3500/year in renewal fees, it isn't too much to ask, is it?


Monday, November 5, 2007

Microsoft Money... still sucking...

The world of personal finance software is one of my least favorite. Anyone developing in this market has a tough job. People are idiots, and personal finance itself is especially designed to cleverly imitate real accounting yet completely fail to make any logical sense at all.

So I am never surprised at the crazy junk I see in personal finance software. The system it targets is moronic in the extreme, and the target user is assumed to equal that standard. 

But I'm a special idiot... one that uses Microsoft Money. I have tried Quicken, but I find their junk just a tad more annoying than MS Money, though for different reasons. I also detest Quicken's license terms and marketing strategies. In contrast, Microsoft's lack of strategy in the personal finance market is quite refreshing. So I keep using MS Money even though I hate it.

The most annoying thing about MS Money is that they somehow make it mandatory to upgrade to the new version every year. Somehow the old version just ends up "breaking" over time until you eventually get pissed enough to upgrade. I have no idea how they manage this, but eventually my old copy just sorta stops working right. Last time it was the fact that it wouldn't work after a patch for IE came out (with no fix available for the old version, naturally). This time it just stopped being able to talk to my bank one day.

Every time I upgrade, I end up being amazed at how little was actually improved, or even changed, from the previous version, and equally amazed at just how many annoying "features" are still exactly as annoying as before. There always seem to be a couple of things I really like that manage to disappear from each version, and one or two new features show up that I don't care about at all. So each time I curse myself for forking over yet another $50 instead of just taking the time to put together my own spreadsheet in Excel... but still I upgrade anyway.

Among my biggest complaints is the default "basic" checking account register. It completely lacks seriously important features, features I'd consider quite "basic"... like the entire right-click menu, with no obvious alternatives for useful things like creating a recurring bill from an entry in the register, marking transactions as reconciled, etc. It is also not obvious that you are even in a "basic" mode, nor that there is an "advanced" mode you could be using instead. Sure, there is a "Basic Register" icon, complete with text label at the top of the page, but it blends in so well that I never even noticed it until someone pointed it out to me. The "basic" mode just makes the application feel like a piece of crap that doesn't do much of anything.

Fortunately, I do know about advanced mode, despite not having noticed the icon. I know this because I had used older versions where the advanced features were enabled by default. When I first upgraded to a version with the "basic" register and saw that those features had gone missing, I figured out how to enable them again via the online help. But the "advanced" register isn't exactly "advanced". It just adds a few minor details to the list of transactions, turns on the right-click menu, and that's about it. But getting advanced mode enabled sure does require a hell of an advanced level of skill. That icon at the top of the page isn't even clickable to toggle the mode! Instead, enabling the advanced register remains something I have to look up in the online "help" feature each time, because it is cleverly hidden deep within the not-so-fun "options" section of the program. Fortunately, the online "help" is one of the few really well done parts of the application, and provides clear instructions for how to enable the missing functionality.

Then MS Money has the "basic" bills tracking system as the default too. In a misguided fit of "simplification", the developers of the default bills tracker helpfully allow you to choose to pay a bill using automatic payment (a.k.a. Bill Pay), electronic payment (e-pay), or by writing a check... But they thought that a "direct draft" bill was just too fucking complicated for the "basic" user, so that option isn't available unless you turn on the advanced bill tracking system. It's almost like they missed the fact that nearly every utility company on the planet pushes direct draft payment options like it was high-grade cocaine... Even people like me that prefer a push payment system over a pull payment system often end up with one or two drafts each month. In my case, some of my utilities actually give a discount if you agree to let them draft your payment. But, according to the crack MS Money development team I now know that only "advanced" users would need direct draft as a bill payment option. Thanks guys.

Then you have odd shit. Like the fact that, even in basic mode, MS Money encourages you to track your gross pay, taxes, withholdings, etc. from your paycheck. Most average joe home-users are obviously people that need to track the exact amount of their 401K deductions, but they would never need anything as advanced as fucking direct draft!

Anyway..

I find this "basic" vs. "advanced" thing to be a very obvious sign of poor UI design. After all, it isn't as if there are THAT many "advanced" things you can do. Seems to me there should be just one account register and just one bill tracking system. Then, by using good UI design, the software can helpfully arrange commands in a way that isn't confusing. The saddest thing is, the "advanced" register and bill tracking systems are actually quite easy to use once you figure out how to enable them. But most average joe home-users will never even learn that there is an advanced mode at all, much less get it turned on. Instead, most people would just think MS Money sucks and would not bother to use it... and, after asking around, this seems to be exactly what most people really do think.

Thursday, November 1, 2007

asp.net - Web Site vs. Web Application

For more info, see my follow-up article: ASP.NET: web site vs. web application project - Part 2 

A few years ago, when I first got a look at asp.net 2.0, it was clear to me that there must have been some serious changes inside the development team at Microsoft. ASP.NET 1.x was a huge leap forward in server side web application frameworks, but it was also geared to an object oriented audience. While it did support direct data access from web pages and everything-in-one-file styles, Visual Studio's use of asp.net encouraged a more tiered design with clean separation of presentation, business, and data access. It also encouraged OO  design by making inheritance, namespaces, and assemblies all obvious, simple, and the default way of things.

By the time asp.net 2.0 was coming out,  more agile languages were getting all the buzz. PHP was still relevant and Ruby on Rails was shaping up to be the next big contender. By that time Java's JSP, the only other heavy OO platform that still mattered, had fallen flat on its face. So it wasn't exactly a surprise that there was a focus within asp.net 2.0 towards the "less is more" philosophy of quick, dirty, do-what-I-mean design. Codeless databinding, master pages, skins, themes, and configuration driven providers for common stuff like membership, profiles, personalization, etc. were all expected.

But what made several of us step back was the new compilation model and the "project-less" web site... er... project. Aside from being one of the most impossible-to-name mechanisms, it was also a breaking change in how asp.net was done. The new compilation model threw out the visual studio project file itself, took asp.net back to the "compile-on-the-fly" concept, all but eliminated the use of namespaces within a web site, and radically altered the way UI templates and their associated code-behind were arranged.
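As a concrete sketch of the compile-on-the-fly concept (the file and class names here are my own hypothetical example, not anything shipped with the framework): in a web site project, you can drop a plain source file into the special App_Code folder and the runtime compiles it on the first request, with no project file and no build step.

```csharp
// App_Code/Greeting.cs -- hypothetical example.
// In an asp.net 2.0 web site, anything under App_Code is compiled
// automatically by the runtime on first request. Note there is no
// namespace, which is the path of least resistance in this model.
public class Greeting
{
    public static string Hello(string name)
    {
        return "Hello, " + name;
    }
}
```

Any page in the site can then call Greeting.Hello() immediately, and if you edit the file, the runtime quietly recompiles it on the next request.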

 But for those of us doing serious web frameworks in large or complex environments, the web site model had some serious drawbacks. The biggest for me was having to manually deal with namespaces, which the UI and code designers would fight every step of the way (until you finally gave up and just let it all pile up in one big default namespace). The loss of the visual studio project file was also painful, and it led to an awkward "exclude from project" mechanism where the only way to get Visual Studio to ignore a file was to physically rename it. Large web sites with lots of code suffered horribly in performance because so many helpful visual studio features, like refactoring and the verification compiler, had to sift through every file in the project with no guidance from project configuration.

 Microsoft seemed to notice the problem, and resurrected the old project model in what they called the "Web Application" Project. This was essentially a retro-fitted clone of the old project model, complete with designer generated code files (updated to take advantage of partial classes, though). The new project type brought back namespaces, and once again encouraged clean OO design patterns. It also became apparent over a very short time that Microsoft itself would consider the web application model the new "enterprise" project type, while the web site project type was more for beginner and casual developers. For example, Team System's testing, build, and deployment work great with the web application project, but work poorly if at all with web site projects. But the new project type also brought with it the need to compile everything in advance, and used a different mechanism to associate UI and code-behind files that was incompatible with dynamic compilation.
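The incompatible wiring between the two models shows up right in the @Page directive. Roughly (the attribute names are real; the file and class names are placeholders of my own):

```aspx
<%-- Web site project: CodeFile points at a source file that is compiled
     dynamically, and Inherits names the partial class defined inside it. --%>
<%@ Page Language="C#" CodeFile="Default.aspx.cs" Inherits="_Default" %>

<%-- Web application project: CodeBehind is only a design-time hint for
     Visual Studio; Inherits must name a class (usually fully qualified
     with a namespace) already compiled into the site's assembly. --%>
<%@ Page Language="C#" CodeBehind="Default.aspx.cs" Inherits="MySite.Default" %>
```

Which is why a page written for one model won't simply drop into the other without conversion.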

With Visual Studio 2008 Microsoft is keeping both project types. But I think they are missing an opportunity. While the web application project and web site projects both have advantages and disadvantages, there doesn't seem to be a compelling reason for some of the features to be mutually exclusive.

 I for one would love to see a hybrid project type. Keep the project file to organize the site and give us a place to put visual studio specific settings, but switch it to use the dynamic compilation model from web site projects. Dynamic compilation and xcopy deployment were powerful ideas, and I can't see any reason they can't still be used. The verification compiler can be optimized quite a lot by having access to a project file... exclusions can be dealt with elegantly, and the whole thing can be sped up. For other code, you could choose to continue to compile to an assembly in advance, or leverage the app_code folder for dynamic compilation, or a combination of both. And of course, keep the namespaces and configurable compiler options.

I'd love to see a web project type that keeps the best of web site and web application projects both. Give me control over the project's structure, namespaces, and compilation but also give me xcopy deployment, dynamic compilation (using my settings), and keep a consistent way of associating an asp.net UI file to its code-behind without requiring the designer generated code file and full compilation.

Oh well, maybe in the next release.

Monday, October 29, 2007

Weighted Obsolescence: Phone book season

Few things annoy me quite like having phone books delivered to my house. There was a time when the phone book was pretty important --a time when taking a decent sized ad out in the yellow pages was the difference between a successful small business and a fire-sale. But between the internet and 411, phone books are obsolete now.

There are still a small number of people, mostly older ones, that consider this whole internet thing to be a fad. They would still prefer a printed phone book... old habits. But for most people the phone book is just a huge waste of paper. And they must be terribly expensive. Someone has to spend large amounts of time composing the contents, doing the layouts, and overseeing a massive printing and distribution operation. The books have to be printed, and that's a LOT of printing. Then they have to be physically delivered, which must represent a truly staggering cost since each one is bulky, heavy, and has to be shipped individually to a different location. Massive costs, and massive waste. Wouldn't it be better to just charge a nominal fee to people who WANT a printed phone book, and just not bother the rest of us?

Despite the irrelevance of phone books though, there is an active war in the business between several different companies. Each of them has, for reasons that still elude me, decided that the phone company isn't qualified to maintain and distribute the directory of phone numbers. Instead, these motards decided that they could do a better job and that it is their moral imperative to bring competition to the phone book market!

So every year, I get two or three huge bricks of useless paper delivered to my door. Well, more precisely, delivered to some random part of my yard that I probably won't go into for at least four or five months after the phone book has been delivered to it... not actually delivered to the door you know. For reasons even more screwed up than the basic business plan, all of these companies decided that they had to deliver their brand of awesomeness at the same time of year.

So last week I got three massive but irrelevant phone books littering my yard. Of course, by the time I noticed two of them, they had been sitting there collecting water and bugs for a few days, making them even more massive. The water and bugs don't make the books any more useless though; that would be an impossibility given their very nature.

It's bad enough that I have to dispose of these relics of hate, but what REALLY chaps my ass is this... I don't have a phone line. In fact, there is not and NEVER has been a phone line at my address. So WHY do I get three damned phone directories that I wouldn't need even if I did have a phone?

You guys seriously need to talk to Maxim. At least they put half-naked girls on the cover of the unsolicited, irrelevant crap they drop off at my house.

Saturday, October 13, 2007

Review - BlogEngine 1.2

I’ve put off reviewing BlogEngine until its development team finished up the 1.2 release. Up until now, I've been using incremental versions between the 1.1 and 1.2 releases, but those were unstable at best. Now that 1.2 is out, and I’ve deployed two sites on it, I think it's time for that review.

BlogEngine is an asp.net 2.0 application. As the name suggests, it is for people who want a blog site.

As a blog, BlogEngine is quite good. It provides the expected features: content postings (duh), comments, RSS/ATOM feed(s), archives, tagging, categorization, etc. It has a slew of extras like “gravatars”, content ratings, support for coComment, DZone, KickIt, and del.icio.us. And of course, BlogEngine takes care of pingbacks, trackbacks, custom tracking, endorsement (bLink), etc too.  The new version also has multi-author support, comment moderation, and several other nice touches.

In short, it does what a blog should do, and it does most of that very well.

Despite the name though, this is not really an “engine” in the classic software definition of the word. Part of the application could be described as an “engine”, but it isn’t well designed for use within other applications. There are two source assemblies: one for the web site itself, the other a class library. But the class library has some tight coupling and a reliance on the hosting asp.net web context.

The class-library contains the HttpHandlers and HttpModules that the web site uses to process the incoming requests. These don’t seem to really belong in the class library since they have questionable value to other web applications that might want to use the class library and would have no value at all in a non-web application. The class library also contains server controls that provide UI for the web site. Again, these have limited usefulness to other apps that might use the class library, and no use if the app isn’t a web application.

The most useful part of the engine would be the actual blog providers. These are responsible for building entity classes that represent the content of the blog and for persisting that content in a permanent data store. BlogEngine ships with an XML provider and a SQL Server provider. The providers could be useful to a wide variety of applications, but unfortunately there is still a heavy assumption of a web site context within the providers and the entity classes. So if you wanted to write a Windows application to manage content within your blog, these providers and entity classes will probably not work out too well for you without re-architecting to eliminate the reliance on the web context.
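As a rough illustration of what decoupling would look like (sketched in Java, and purely hypothetical; these are NOT BlogEngine's actual types), a provider that receives everything it needs as parameters can be hosted by any kind of application, web or otherwise:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch only; not BlogEngine's real API. The point is
// that nothing here touches an ambient web request context, so a
// Windows client could use the same provider and entity classes.
class Post {
    final String id;
    final String title;
    Post(String id, String title) { this.id = id; this.title = title; }
}

interface BlogProvider {
    Post loadPost(String id);
    void savePost(Post post);
}

// A trivial in-memory implementation; an XML or SQL Server backed
// provider would sit behind the same interface.
class InMemoryProvider implements BlogProvider {
    private final Map<String, Post> store = new HashMap<>();
    public Post loadPost(String id) { return store.get(id); }
    public void savePost(Post post) { store.put(post.id, post); }
}
```

The swap between storage back-ends then happens behind the interface, and no caller needs to know whether it is running inside a web request.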

Architecturally, I always have reservations when I see the provider design pattern applied to an entire tier like this. Providers work great when applied to a specific and narrowly defined “service”, but providers used as whole tiers don’t scale very well, they overcomplicate things, and they can make maintenance or later extension much too difficult and cumbersome. Fortunately, BlogEngine's DAL is fairly small, so it doesn't suffer too much from these problems yet.

Having criticized the architecture, let me say that the programmers who wrote this code are VERY good. This is not the work of amateurs or idiots. The code is clean, neat, and organized. While I dislike how the code was architected, I have to acknowledge the development team’s skill and professionalism too. The code is very well written.

As for the web site itself, it follows a simplistic approach to skinning. The UI code is a mix of asp.net pages, master pages, user controls, and server controls. Quite a lot of the content displayed in the UI is generated by direct method calls with the output injected straight into the HTML. You’ll see a LOT more <%= GetSomeHtmlOutputAndPutItHere() %> type things within the pages than is typical in most asp.net apps. This simplicity gives a lot of freedom in skinning the application, but it also tends to get a little chaotic. The skins resemble php or classic asp apps a little more than I personally like. You’ll have to use existing skins as a guide to build your own, and you won’t get as much use from the Visual Studio designers either.

On the plus side, skinning is usually a matter of just editing a master page and two user controls within the theme folder. I’ve implemented two sites on BlogEngine now, though, and with both I found several things that required editing code outside the theme. Some of those changes were in code within server controls from the class library assembly too. While this is fine for my purposes, it does mean the site ends up being hard-coded to the one theme I created, which indicates that the skinning approach could still use some work.

Another issue that I’ve not investigated is performance and scalability, but I suspect that this app would have trouble with large amounts of content. I suspect that the XML provider would suffer performance issues much sooner and more severely than the SQL provider would. But performance will be great for most blogs since they don't tend to get that large.

The verdict:
  • If you want a good personal blog, and want to host it on your own server or web hosting account, then BlogEngine is a good choice. It provides all the functionality you’d need and a few good extras too. It will not be too difficult to create a custom theme for your site as long as you are good with HTML/CSS and are somewhat familiar with asp.net.
  • If you want code that you can use in your own application to help you with blog or blog-like functionality, then move along. Also, if you need multiple blogs hosted within one web site or have a high-volume blog, then BlogEngine is probably not for you.

Friday, August 17, 2007

Stupid Crusade - Block Firefox to protect your revenue!

Seems someone out there really thinks that a browser that blocks ads is theft. So they've outright blocked all Mozilla Firefox browsers and are advocating others do the same.
Software that blocks all advertisement is an infringement of the rights of web site owners and developers. Numerous web sites exist in order to provide quality content in exchange for displaying ads. Accessing the content while blocking the ads, therefore would be no less than stealing.
Their message gets richer too... so check out the entire thing:

http://whyfirefoxisblocked.com/

Never mind that ad-blocking is available for just about any browser, and most Mozilla users are savvy enough to work around being blocked... what boggles my mind is how these people think that blocking an entire browser will increase their ad revenue.

Of course, it seems clear that the message is intended to drag users into the politics of the issue, and is not an honest attempt to increase ad revenue. Sadly for them though, most users who see this message are going to be solidly on the other side of the argument... and pissed about being blocked!

Way to gain sympathy there guys!


GetOffMyCase!

One of those tired old arguments you see crop up again and again among programmers is that of case sensitivity in programming languages. The general trend currently seems to be that most are advocating a move towards case insensitive languages. These people tend to call case sensitivity a legacy of the “old” days. They then go on to cite that modern compilers have long been able to save us from the case sensitive boogeyman. Perversely, the popularity of case sensitive languages seems to be rising while the popularity of case insensitive ones is falling... and no, I have no empirical evidence of this; it's just something that seems true based on my own observations over many years in the field.

The arguments against case sensitivity range a bit, but one common complaint is that mistakes in case cause hard to detect errors or lead to having two different functions whose name differs only in capitalization. Another common complaint is that comparing strings or working with things that are strings in case sensitive languages often leads to bad behavior, especially when dealing with file paths or URLs where the underlying platform is not case sensitive but the language itself is.
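The first complaint is easy to demonstrate with a contrived sketch in Java (another case sensitive, C based language): the compiler will happily accept two methods whose names differ only in capitalization.

```java
// Contrived illustration: both methods are legal in a case-sensitive
// language, so a capitalization slip at the call site can silently
// bind to the wrong one.
class Account {
    int getBalance() { return 100; } // the intended method
    int getbalance() { return -1; }  // also legal... and a land mine
}
```

A case insensitive compiler would reject the second method outright, which is exactly the behavior that camp is arguing for.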

Look, I’ll grant a couple of things:
  • Scripting languages or any language that makes heavy use of late binding should avoid being case sensitive. JavaScript in particular would be a LOT better if it were not case sensitive. In fact, JavaScript is what I like to call a case-destructive language.
     
  • String comparisons in ANY language should, by default, use case insensitive comparisons unless explicitly told otherwise. Most languages provide a mechanism to perform case sensitive or insensitive comparisons, but most C based languages default to doing a case sensitive match. Even C# (my language of choice) is guilty of this.
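The second point is easy to show, again sketched in Java, which like C# defaults to a case sensitive string comparison and makes you ask explicitly for an insensitive one:

```java
// Comparing file paths case sensitively is a classic source of bugs on
// platforms (like Windows) where the file system itself is not
// case sensitive.
public class PathCompare {
    public static void main(String[] args) {
        String a = "C:\\Temp\\Readme.txt";
        String b = "c:\\temp\\readme.txt";

        System.out.println(a.equals(b));           // false: the default comparison
        System.out.println(a.equalsIgnoreCase(b)); // true: insensitivity is opt-in
    }
}
```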
While there are a couple of points these advocates make that I can agree with, I disagree overall that case insensitivity is “better”. In case sensitive languages, after a little exposure, the use of case becomes a useful tool on its own.

Once you understand the conventions of the language you can usually tell a LOT about what is going on in code just by observing which case is being used.

For example, identifiers in C# are camel cased when they are local variables, private fields, or method parameters. Pascal case is used for public members, class names, etc. The use of case to distinguish meaning in C# is generally very consistent from one developer to another. The adherence to good naming conventions is partly due to the case sensitivity of the language itself; it is not a workaround adopted in spite of case sensitivity.

When I see something called myName in C# I KNOW I’m dealing with a local variable or private member. When I see MyName I know I’m dealing with a public member.

This is NOT confusing to me or most C# developers. On the contrary, it is quite elegant. We get used to noticing the case of identifiers, and the case used instantly tells us something useful about the code.

But when I deal with VB code, I have to play guessing games when I see myName1 and myName2. Sure, conventions help some there too, but the conventions always feel like sloppy workarounds where the sole purpose is to support case insensitivity.

Of course, most of this relies on a decent understanding of the conventions of the language and the programmer being competent enough to follow those conventions. I’ve been working in C# for many years and I’ve never had serious problems with case induced errors. Sure, occasionally I have made a case related error, but I can’t say that this happens more often than the mistakes I’ve made in VB due to inconsistent and awkward naming. In both languages those kinds of mistakes usually get caught by the compiler or the IDE quickly.

Case sensitive languages have another side effect though, and in my mind this is THE most important thing. Case sensitivity makes programmers pay attention to detail. This may seem like a very simple thing, but attention to details is THE skill that I’ve discovered marks the difference between a decent programmer and a great one.


Wednesday, August 15, 2007

Local Coffee Shops Suck - Go Starbucks!

I have many friends and acquaintances that "hate Starbucks". Hating Starbucks might even be more popular than hating Paris Hilton.

They all have their pet complaints, usually about some isolated unfair trade practice or injustice carried out by some mid-level area manager or something. The argument always tends to wrap up with how local coffee shops suffer when Starbucks comes to town (the Wal-Mart argument).

But the sad fact of it is that any company that size will have ass-hats that occasionally do bad things. I try not to hold that against a company unless that kind of behavior becomes the executive policy of the company as a whole (which is why I don't hate Wal-Mart or Starbucks but do hate Disney and Apple).

Anyway...

Let me tell you why Starbucks kicks ass... aside from the fact that they actually serve a decent enough cup of coffee.

The reason Starbucks kicks ass is that they understand you can't stir a 20oz cup of hot liquid with a fucking 4 inch tall plastic stick!

Starbucks has big sticks that fit down into the cup all the way to the bottom without burning your fingers! Out of dozens of local shops I've been to in dozens of towns, maybe one or two have had sticks that are appropriate for their larger size cups.

You'd think that if your fucking business IS coffee, then you'd take the time to notice that 4 inches of stick does NOT-the-fuck fit into 10 inches of cup!

How god-damned hard is it to understand ?!?!

See, Starbucks pays attention to those little details.

They provide this level of service in EVERY store.

So... I don't have to wonder if I'll have a way to stir my shit when I get to Starbucks because ALL of their shops have good sticks.

I don't have to wonder if I'll be stuck trying to open half a dozen tiny little packets of sugar (which is always messy) because Starbucks has both packets AND a free-pour canister of sugar on the counter (I know... almost unbelievable!)

That is why Starbucks is killing your local coffee shop.  


Friday, July 27, 2007

VS 2008 and .NET 3.5 Beta 2 Released

VS 2008 and .NET 3.5 Beta 2 Released - ScottGu's Blog

Been waiting on this for a while :) I haven't figured out if it shipped with a go-live license yet though.

I'm particularly eager to get a good look at the Team System stuff in 2008. I've been putting off testing of the beta team system stuff until it got more stable (deploying team server is a pain, even if you are using a virtual server).

Assuming this does have a go-live license, it will finally allow me to put together an update for the CCK project on the new platform too. More on that later...

Microsoft Robotics Studio

Microsoft Robotics Studio

Microsoft is a REALLY big company. Even for those of us that work in IT and make our livings off Microsoft's technologies exclusively, they still sometimes manage to slip products and technologies by us... but rarely does Microsoft have an entire development platform that manages to elude my notice.

I just came across this one. I wasn't even aware Microsoft was doing anything in robotics, much less actually had a development platform on the market.


Thursday, July 26, 2007

Reddnet now powered by BlogEngine.net

I've replaced the reddnet site with a new blogging application based on the BlogEngine code base.


Most of the scribbles section made it over to the new application, but the old posts' formatting remains a mess. I didn't move the old comments or ratings over though. I'll be fixing the formatting, tagging, and categorization of the posts over the next few weeks.


BlogEngine is a pretty neat community application. It is much simpler in scope than the CSK code-base that reddnet previously used, but simplicity is a good thing since I have very limited time to manage my personal site. I will of course switch reddnet over to the CCK code-base once I've finished writing it :)


I'll be reviewing BlogEngine.net here in the near future.

Tuesday, March 6, 2007

Imaginary Usefulness

Few subjects frustrate me faster than mathematics. Math is an essential skill and a useful tool. But our school systems, secondary and primary both, totally fail to teach common sense along with it.

I’m constantly amazed at how many smart people are made stupid by an extensive education in Mathematics. Mostly this is because math is taught more as theory than applied science... especially in primary education. Through high-school most people learn a crap-ton of math theory and mechanics.  But there just aren’t many classes taught that apply math in a useful way. 

This leads to the question eventually asked in every math class: “when will I ever need to use this in the real world?” For higher math, anything above basic Algebra, the answer for most people is “probably never”.

That aside, the mechanics and theory still need to be taught so as to give students the possibility of going into a career where they could actually use higher math. A basic familiarity with more advanced math is useful to a sorta degree, but much more useful would be classes on the applied uses of math... classes that show how and when math is useful.  But what I’d like to see taught most is common sense.

For example, in my 10th grade Algebra II class we spent about three straight weeks going over imaginary numbers. Ok, let me get this straight... due to the practical limitations of the universe this math cannot be done. But if you ignore that little rule, you can still use the mechanics of our math system and generate numbers that have no meaning... they are imaginary.

Umm... excuse me, but shouldn’t that be the fucking end of the lesson?!?!?  But NOOOOoooo!  After that we had to spend weeks working math problems that result in imaginary numbers. Then we had tests on how accurately we could solve these problems.

So, we are going to be tested on our ability to come up with accurate answers to a math problem where both the question and the answer have absolutely no meaning at all?

Why can’t I just answer “turd-jam” for all the questions and still get credit? After all, “turd-jam” has just as much meaning in context with the questions as the actual numerical “answer” would... and in 10th grade it would have at least been funny, which is more useful than any other answer I might have derived. This is a classic example of the failure to apply common sense in math.

As far as I can tell, the first major failure to teach common sense in math starts when they teach you the concept of negative numbers. Numbers measure quantity. That’s fucking it... Quantity. The real usefulness of any math is in the question “quantity of what?”

The only non-intuitive thing about quantity is the concept of zero, but this is fairly easy to explain to kids. Once zero is understood it becomes common sense. Zero also becomes the only non-positive number that remains meaningful.

But negative numbers are NOT real. They cannot, and do not, actually exist. If you solve a math problem and the result is a negative number, then all you have learned is that the problem is flawed, you have incomplete information, and/or the number you produced is not a valid measure of anything that exists in the same context as the original question. Since the basis of the question was wrong, any number you derive no longer has any meaning and you should stop right there. If you continue with negative numbers then you are either just making shit up, or you are measuring something in a different context than the one assumed in the original question... and you had better have a damned good idea what the new context of your answer is, or you are screwed!

But they never teach that concept in school.

Let us use a practical example.
You own an appliance company. You have 1 refrigerator in stock. A customer comes in and buys 4 refrigerators. How many refrigerators do you have left in stock?
If you answered negative three (-3), then just kill yourself now.

In-stock means “physically sitting in your warehouse”. If you walk back to the warehouse, I can only guarantee one thing... you, the fuck, will not see negative three refrigerators sitting back there. The only meaningful answer to this question is “unknown”. How many refrigerators you have in stock after selling 4 of them depends on whether you gave the customer the one fridge you already had, or whether you are waiting to deliver all 4 fridges after your vendor delivers them to you. The original problem contains insufficient information. When you walk back there, you will either see zero or one fridge in your warehouse... So when you see -3 in your answer, what it really means is that you are measuring the wrong thing. You don’t have -3 of anything. You may have +3 of something else, like refrigerators you need to order from a vendor, but that number is valid only in a different context, and only when the new context makes it a positive number or zero.
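In code, the same advice amounts to measuring two quantities that actually exist instead of one signed one. A small Java sketch (the names here are mine, purely for illustration):

```java
// Instead of letting "stock" go to -3, track two non-negative numbers:
// how many units we can hand over right now, and how many we still
// need to order from the vendor.
class OrderMath {
    static int unitsToShipNow(int inStock, int ordered) {
        return Math.min(inStock, ordered);
    }
    static int unitsToBackorder(int inStock, int ordered) {
        return Math.max(0, ordered - inStock);
    }
}
```

For the refrigerator example, unitsToShipNow(1, 4) is 1 and unitsToBackorder(1, 4) is 3. Both numbers describe something you could actually walk back to the warehouse and point at.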

Which brings us back to imaginary numbers. How surprising is it that it is impossible to perform some mathematical operations when you are using numbers that simply mean you are working with values from a different context than the one your math problem represents? In the case of imaginary numbers you are working with numbers from a context that cannot exist at all, so you should just give the fuck up right there and rethink your math problem so that it addresses a real question in a meaningful context.

I see this junk all the time in programming. I get requests like, “I need a report that shows the number of brain cells left in my head”.

Sorry... but that report results in negative numbers and therefore has no useful meaning.

Why not ask a different question, like “how many brain cells will I need to have implanted before I regain the ability to chew gum and walk at the same time?” Any math should answer a question in such a way that the answer has some useful meaning. If you are coding with negative numbers then your code measures the wrong thing. Flip something around so you are measuring things that exist. Your code will be easier to understand and will produce results that have more meaning.

With math, please always use common sense. Mechanics are neat, but you have to understand the “why” of your math, not just the “how”. That is especially true when you are making up math that gives answers that other people have to interpret.