Wednesday, May 30, 2007

Unused features might sell the product

During a Scrum training session at my company, an interesting discussion arose between a few of the attendees.

As you might know, Scrum is based on Lean thinking, which focuses on producing as little waste as possible. Our Scrum trainer presented a slide showing that 45% of the functionality in a product is never used, which makes it the equivalent of Lean's worst kind of waste: over-production. According to Lean thinking it should therefore be eliminated or avoided.

One of the attendees asked: "but what if that 45% contains features that helped sell the product?". That is a valid question, especially in a market where the buyers of a product are often not its users.

Unfortunately nobody had a really good answer. The trainer's remark that such a feature set is not "sustainable" was true, but irrelevant. Does anyone else have an answer? What do you do with features that no one uses, but that effectively sell the product? Leave them in without maintenance? Advertise them without properly implementing them? What do you do?

Monday, May 28, 2007

Catching up

I just returned from an 8-day holiday. And since this was a bicycle holiday in the center of France, I've been disconnected from the internet for over a week. So of course I'm spending much of today catching up with what's been going on. And the amount of time it takes to catch up these days is starting to worry me.

It's amazing -to me at least- how much information I apparently consume during a normal week. There were about 300 emails waiting in my work and private mailboxes. Granted, most of these were automatic emails from build systems, newsletters and the like. But even after scanning those, there were still about 60 emails that required my genuine attention. To some of you that might be peanuts, but to me it means another 5 to 6 hours to read and answer them. And that is excluding the time I've already spent this morning answering email from friends who were apparently unaware of my short holiday. Sorry guys!

But what I really started to notice is that these days most information reaches me through feeds (both RSS and Atom) via my trusted Google Reader. Although Reader refused to tell me anything more useful than the fact that I had 100+ unread items, a quick count showed that there were about 600 items in my list. That's a lot of news, people!

Luckily I can quickly mark all items read in many feeds, like Slashdot and dZone. I normally quickly scan these feeds for interesting stories as they arrive, but I can easily skip a week. If something really interesting happened on any of those feeds, it'll probably show up on other feeds too.

But that still leaves me with the feeds that I really value. Jeff Atwood of Coding Horror has been a busy guy and posted no fewer than five stories. I've already read a few, and they seem as interesting as ever. There is of course the weekly column by Cringely, which -although not always interesting to me- is a must-read for anyone in tech. And after that, there's still TheServerSide, A List Apart, Ajaxian and many others. It looks like I'm fully booked for the rest of the day. People should really stop writing all this interesting stuff... at least while I'm on holiday.

Friday, May 18, 2007

Scalable game graphics

Over the weekend I downloaded the demo of Command & Conquer 3.


What's up with demos these days? Over 900MB to play a tutorial and two small missions. This is really getting ridiculous.

I wasn't expecting to be able to really play the demo. My laptop is almost three years old, you see, so it's really not up to today's gaming requirements. The graphics card isn't bad, but it's seriously lacking memory. And so is the PC itself, because in these gigabyte days the 512MB in my laptop seems meager in comparison. Or so I thought...

My expectations were based on recent demos I'd tried. The best of these did run on my system, but were simply unplayable. The worst was Supreme Commander, which didn't even want to start on my system. Come on, people! I know this is probably the best game ever released, but does that really mean you should require a machine with at least a gig of memory just to have a look at it? Really?

With this experience, I wasn't expecting Command & Conquer 3 to do much better. But much to my surprise the game installed without a problem and even started up in an acceptable time. After the entertaining introduction movie I got to the menu and started the tutorial mission. That too loaded in an acceptable time. And more... it was actually quite playable. Admittedly, the graphics didn't look very detailed, which at times made it hard to make out the separate units. But still... it was playable! On my three-year-old laptop!!!

Hardcore gamers might complain that Command & Conquer 3 is evolutionary instead of revolutionary. But for me it means that I can actually play the new Command & Conquer, instead of having to guess from screenshots and reviews whether Supreme Commander is worth buying. Especially if buying Supreme Commander also means having to upgrade to a new machine!

Thursday, May 17, 2007

ReSharper: C# background compilation

In a recent post Jeff Atwood complains about the lack of background compilation of C# code in the Visual Studio IDE. Coming from a Java background -where background compilation is pretty much standard in any IDE- I found this one of the first annoyances when I started doing C# projects. For a tool that has gone through so many iterations of improvement, it is amazing how many of these annoyances are still left in Visual Studio these days.

Luckily the solution for most annoyances is easy and not even that expensive: get ReSharper.


ReSharper is a tool made by JetBrains that fills many of the gaps you might find in Visual Studio. Gaps that are especially obvious if you come from a Java background and have used JetBrains' masterful Java IDE: IntelliJ IDEA.

IntelliJ IDEA is what made JetBrains famous. And they've copied pretty much every feature over to ReSharper. And then added some more.

IntelliJ IDEA is simply the most productive development environment I have ever used, and that includes Borland's old tools, which pretty much set the standard for me. Everything from simple auto-complete (like IntelliSense) to smart auto-complete (which allows me to "type" complete lines of code by just pressing alt-enter a few times) to an extensive set of refactorings is in IntelliJ IDEA by default.

Keep in mind: the version of IntelliJ I use almost daily at work is about four years old now, so IntelliJ IDEA is probably even better these days. We've just never gotten round to upgrading it. And since we're still stuck writing for Java 1.4 at work, upgrading is not strictly necessary. But it just shows how good IntelliJ is: a four-year-old version of it still beats Visual Studio (at least as far as productivity is concerned) every day.

Monday, May 14, 2007

Successfully implementing Scrum

At my company they're now introducing Scrum. Traditionally most of our projects were done using a sort-of waterfall approach. But in recent years it has become harder and harder to get product releases onto the market in a reasonable amount of time using this approach, so management has been looking for a solution.

I have actually led some projects in the past where we already used an iterative development process. Some of these projects were immensely successful, while others failed to deliver the intended result in a reasonable amount of time. I am curious to see whether following the more standard -and better defined- Scrum framework will result in more reliable success.

In a Google Tech Talk I recently watched, where Ken Schwaber explains some of Scrum, he mentioned that only about 35% of the companies that try to implement Scrum actually succeed in doing so. And while everyone -including Schwaber of course- is very positive about Scrum's advantages, I haven't seen many details about what can make it fail. And more importantly: how to avoid the dangers that might make it fail.

In other words: how do we avoid becoming part of the 65% that doesn't succeed in implementing Scrum? Does anyone know?

Saturday, May 12, 2007

No JavaOne

If you're a professional Java developer, you really can't have missed it: this past week was the week of the annual JavaOne conference. And somehow I managed not to attend it once again. As a Java enthusiast since 1994 (the beta) and a Java professional since 1999, I often wonder why I have never attended a single JavaOne conference.

It probably has to do with two things. First, I'm not a big conference-goer to begin with. I know conferences are a great way to get a glimpse of all kinds of technology and -more importantly- are about the best way to improve your network, but somehow I prefer learning a new technology by working with it. And networking... well, that's not really my thing.

The second reason I never attended JavaOne must be the distance. Coming from Europe, it's really quite a journey to attend a conference on the other side of the pond. And it's not just the journey itself, but also all the arrangements you have to make, including getting my employer to pay for the trip, because -let's face it- these things aren't cheap.

So once again I didn't get to see the keynote and the introduction of new cool Java-based technology. And once again I didn't get to see the winner in the T-shirt hurling contest. But then again, neither did anyone else because the T-shirt hurling was canceled this year.

Tuesday, May 8, 2007

Driving VMWare


I'm currently working on a project that consists of multiple subsystems. Integrating those subsystems is normally a matter of getting the installer for each and running them on a machine that has been prepared with all the prerequisites. It's not really a lot of work, although it can be a bit cumbersome at times.

But since we build installers only once per week, we also only get one such integration build per week. And for some of the stuff I'm working with that just isn't good enough.

So I decided to do something about it and see if I could create a nightly integration build. Luckily we already run quite a few unit tests every night and -as it happens- one of the teams is using VMWare as the basis of their unit tests in a way I hadn't seen before.

The team has a VMWare image with a snapshot taken just after they installed all the prerequisites on it. Every night, they revert the VMWare image to that snapshot and then install the software on it through a combination of NAnt files, batch files and custom-made executables.

The key to installing the software without security issues is to have the installation run on the VMWare image itself, which is accomplished by using the wonderful psexec tool. Psexec allows you to start a process on a remote machine. Or a remote virtual machine, in our case.
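
Psexec's basic syntax is simply psexec \\machine -u user -p password command. So a minimal (hypothetical) NAnt version that just runs ipconfig on the virtual machine would look something like this:

  <!-- runs "ipconfig /all" on the remote (virtual) machine -->
  <exec program="psexec.exe" basedir="${pstools.dir}"
        commandline="\\${target.name} -u ${runas.username} -p ${runas.password} ipconfig /all" />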

So the main NAnt file starts with this instruction:

  <exec program="vmrun" append="true" commandline="reverttosnapshot &quot;${vmware.image.file}&quot;" />
This reverts the VMWare image to its snapshot.

And then many of these instructions:
  <exec program="psexec.exe" basedir="${pstools.dir}" commandline="\\${target.name} -u ${runas.username} -p ${runas.password} -i nant -logger:NAnt.Core.XmlLogger -f:&quot;${host.name}\install_module_a.build&quot; -D:database.type=mssql -D:database.cm.server=amssql2005 -D:database.cm.database=nightly_53_${target.name} -D:database.cm.username=dbuser53 -D:database.cm.password=dbuser53" />
I think you'll agree that this isn't the most readable NAnt instruction ever seen. But it basically starts NAnt on the remote (virtual) machine with a .build file that is on the (physical) host machine. All the -D arguments just tell the .build file what to do on which machine.
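
I haven't tried this myself, but NAnt's <exec> task also accepts nested <arg> elements, which should make the exact same call quite a bit more readable:

  <exec program="psexec.exe" basedir="${pstools.dir}">
    <!-- NAnt quotes any value containing spaces, so the explicit &quot; entities are no longer needed -->
    <arg value="\\${target.name}" />
    <arg line="-u ${runas.username} -p ${runas.password} -i" />
    <arg value="nant" />
    <arg value="-logger:NAnt.Core.XmlLogger" />
    <arg value="-f:${host.name}\install_module_a.build" />
    <arg value="-D:database.type=mssql" />
    <arg value="-D:database.cm.server=amssql2005" />
    <arg value="-D:database.cm.database=nightly_53_${target.name}" />
    <arg value="-D:database.cm.username=dbuser53" />
    <arg value="-D:database.cm.password=dbuser53" />
  </exec>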

After a bunch of these psexec commands, we tell the virtual machine to reboot:
  <exec program="psshutdown.exe" basedir="${pstools.dir}" commandline="-r -t 10 -f \\${target.name}" />
And voila, we have a fresh nightly build ready for testing.

I'm pretty sure we could also have accomplished that last step through some of VMWare's tools, but -like pretty much all Sysinternals tools- psshutdown just happens to be part of our standard tool belt.

Now all that is left to do is run an automatic sysprep on the fresh build, so that multiple people can start it without getting in each other's way.
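
I haven't built that part yet, but the same psexec trick should work for sysprep too. A rough sketch -assuming sysprep.exe has already been copied to c:\sysprep on the image, and with flags that depend on the Windows version- might look like this:

  <!-- reseal the machine so every tester gets a unique identity; the flags are assumptions for the XP/2003 era -->
  <exec program="psexec.exe" basedir="${pstools.dir}"
        commandline="\\${target.name} -u ${runas.username} -p ${runas.password} c:\sysprep\sysprep.exe -reseal -quiet -reboot" />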

Friday, May 4, 2007

My simple spam monitor

GMail has a very good spam filter. So good, in fact, that I switched my personal email domain over to Google's hosted mail solution just to benefit from their spam filter.

GMail keeps mail marked as spam around for 30 days, so you can occasionally check it for "false positives" (emails that were marked as spam, but are in fact not spam).

I used to check the spam box every day and then immediately empty it. One reason to do so was that you're greeted by a cute "Hooray, no spam here!" message when there are no spam messages.

Nowadays I hardly ever check the spam box, because the filter has proven itself quite reliable. But since I stopped checking the spam for false positives, I also never clean my spam box anymore. The spam keeps accumulating until GMail throws it away automatically after 30 days.

As it turns out, this box now makes a very handy spam monitor. It allows me to quickly see the spam volume for the last 30 days. A few weeks ago it reached a high of just over 1200 messages; currently it's at about 960. So spam is on the decline. Unfortunately, it will most likely rise from its ashes in a few weeks. I will know when that happens, but thanks to the GMail spam filter I will not be bothered by it.