Sunday, January 31, 2010

David Cameron defaced

When the smug face of David Cameron started popping up on billboards all over the place, I had a very strong urge to deface it. It would appear I wasn’t the only one, although these are generally more amusing than what I had in mind.

http://www.mydavidcameron.com/

Saturday, January 30, 2010

Metastorm release BPM 9 and don’t tell anyone

Here’s an odd thing. When most software companies release a new version of their software, they shout about it until they are hoarse. But Metastorm have released version 9 of their BPM software and as far as I can see they haven’t even produced a press release to announce it to the world. Searching on Google News brings back no results and I can’t see any mention of it on the Metastorm website.

Initially I’d assumed this was because version 9 was released in the quiet period before Christmas and the PR onslaught would start in the new year, but here we are well into 2010 and there is still silence.

Now, generally, if it’s a choice between cock-up and conspiracy, I’ll plump for cock-up every time. But I really can’t believe that any software company could forget to produce a press release announcing their new baby to the world, so I have to go for the conspiracy option. But what is the conspiracy? Answers on a postcard or in the comments…

Thursday, January 28, 2010

Buying steak and kidney puddings on the web

I was reminded of steak and kidney puddings some time ago when I read Stuart Maconie’s “Pies and Prejudice” and suddenly had a strong desire to once again experience their long-forgotten taste. But my searches on the web to find somewhere to purchase them in this culinary wasteland called London led me nowhere. Then the other day I received an email from Holland's Pies and ventured back to their website, where I discovered they have recently opened their online shop, with a selection of their pies available. So £18 and a couple of days later I am now in possession of ten pies (including several steak and kidney pudds) and am looking forward to sampling one for my lunch tomorrow…

Monday, January 25, 2010

Cost benefit analysis of energy efficient bulbs

We’ve had six GU10 bulbs in our kitchen for a while now and I’ve never really been very happy with them. They swallow a lot of power, don’t seem to last very long and are pretty expensive to replace. I got hold of an LED replacement some time ago and was less than impressed. It didn’t produce enough light, was really, really expensive and then died after a couple of months. So when I popped into Maplin at the weekend, I thought I’d try out their low-energy GU10 replacement. Admittedly these are even more expensive than the full-powered versions, but they claim to last for 8000 hours and use a lot less power. I’m happy with the amount of light they throw out, although they suffer from the usual problem of taking a while to warm up. Anyway, I thought I’d do a quick calculation of how long it would take me to earn back the money I shelled out for them.

Cost per kWh – £0.15

Old: 6 bulbs @ 50W = 300W

New: 6 bulbs @ 11W = 66W

Power saved = 0.3kW - 0.066kW = 0.234kW (i.e. 0.234 kWh saved per hour of use)

Cost of bulbs = £7.49 x 6 = £44.94, call it £45

Number of kWh required to cover cost = 45/0.15 = 300 kWh

Number of hours required to cover cost = 300/0.234 = 1282 hours

Assuming 5 hours use a day = 1282/5 days = 256 days, or roughly 8.5 months
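The sums above can be double-checked with a quick script (the figures are the ones from the list; the variable names are just mine):

```python
# Figures from the calculation above
cost_per_kwh = 0.15              # £ per kWh
old_power_kw = 6 * 50 / 1000.0   # six 50W bulbs, in kW
new_power_kw = 6 * 11 / 1000.0   # six 11W bulbs, in kW
bulb_cost = 45.0                 # £7.49 x 6 = £44.94, rounded up as above

power_saved_kw = old_power_kw - new_power_kw       # 0.234 kW
saving_per_hour = power_saved_kw * cost_per_kwh    # £ saved per hour of use
hours_to_break_even = bulb_cost / saving_per_hour  # hours of use to recoup cost
days_to_break_even = hours_to_break_even / 5       # at 5 hours' use a day

print(round(hours_to_break_even), "hours,", round(days_to_break_even), "days")
```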

So based on this very rough calculation and assuming I’ve calculated it correctly, this would appear to be a good deal. Fingers crossed they actually last for as long as claimed, which I’m reasonably confident about based on my experiences with other energy efficient bulbs.

Monday, January 04, 2010

Getting all the points in a SqlGeometry

It’s quite simple to get hold of the points in a SQL Server geometry polygon using the STNumPoints and STPointN SQL functions, but doing it that way requires quite a few queries (or some kind of stored procedure). I tend not to like running lots of queries, even if avoiding them smells like premature optimisation, and for my little project I don’t want to be adding stored procedures. But then I realised the SqlGeometry type is implemented as a .NET type, so all its properties and methods are available from .NET code. So here’s a little extension method to get all the points for a polygon:

using System.Collections.Generic;
using Microsoft.SqlServer.Types;

public static class GeometryHelper
{
  // Returns every point in the geometry as an array of single-point geometries
  public static SqlGeometry[] Points(this SqlGeometry geometry)
  {
    List<SqlGeometry> points = new List<SqlGeometry>();
    // STPointN is 1-based, matching the T-SQL function of the same name
    for (int i = 1; i <= geometry.STNumPoints().Value; i++)
    {
      points.Add(geometry.STPointN(i));
    }
    return points.ToArray();
  }
}

Sunday, January 03, 2010

Converting a SqlDouble to a .NET double

Note to self - this is simple, just typecast like so:

(double)points[i].STX

Equivalently, you can use the Value property: points[i].STX.Value

Improving performance with Gzip compression in PHP

I was perusing some of the data provided by Google Webmaster Tools when I came upon the Site Performance section. This told me that doogal.co.uk took longer to load than 96% of sites on the web, which was a little embarrassing. I knew what was causing at least some of this slowdown: the incomplete list of UK postcodes, which has been getting gradually slower as the list of postcodes has grown longer and longer. I’d already started to address that by paging the data, but the CSV data couldn’t really be paged without reducing its value. So I had a look at the suggestion provided by Google, which was to use Gzip compression on the page. I’d not really thought much about Gzip compression before, assuming that if it was so useful it would be on by default, but I thought I’d give it a go anyway. So I fired up Fiddler and tried downloading my big CSV page: it took approximately 9.5 seconds to fully download (an average over several reloads).

Next I added this line to my PHP source file, ob_start("ob_gzhandler");, and measured the difference. It now took 3.7 seconds to download, wow! A lot of gain for very little effort.
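To get a feel for why a big CSV page gains so much from Gzip, here’s a quick sketch (in Python rather than PHP, purely for illustration) compressing some made-up, postcode-style rows with the standard gzip module; repetitive text like CSV compresses dramatically:

```python
import gzip

# Build some repetitive CSV-ish data, similar in character to a postcode dump
# (these rows are invented for the demonstration, not real postcodes)
rows = ["AB%02d 1CD,51.5%d,-0.1%d" % (i % 100, i % 10, i % 10) for i in range(10000)]
data = "\n".join(rows).encode("utf-8")

compressed = gzip.compress(data)
print("raw:", len(data), "bytes; gzipped:", len(compressed), "bytes")
```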

At this point, I wondered if I could add Gzip compression to all my pages. It turns out this is straightforward, just add the following to the .htaccess file

php_flag zlib.output_compression on

After adding this, the CSV page now arrived in 2.7 seconds. I’m not sure why this is even faster, but I’m not complaining. Now the rest of the site feels much snappier as well. So what am I missing? Is there a reason GZip compression isn’t on by default? Am I going to get bitten by this at some point in the future?