Monday, December 15, 2014

Grading bike ride climbs

I’ve updated the route elevation page on doogal.co.uk so you can now grade climbs. It uses the same rating as Strava, which is based on the UCI climb categorization. Hopefully it will be useful if you can’t find a matching segment on Strava.
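
For the curious, the categorisation boils down to a climb score of average gradient (%) multiplied by length (m). Here’s a simplified C# sketch of the idea, using the commonly quoted Strava-style thresholds; treat the exact numbers, and the 3% minimum, as approximations rather than a definitive description of what the page does.

static string GradeClimb(double lengthMetres, double elevationGainMetres)
{
  // climb score = average gradient (%) x length (m); note this reduces to
  // 100 x elevation gain, which is why a minimum gradient rule is needed too
  double gradientPercent = elevationGainMetres / lengthMetres * 100;
  if (gradientPercent < 3)
    return "Uncategorised"; // climbs averaging under ~3% are said to be ignored
  double score = gradientPercent * lengthMetres;
  if (score >= 80000) return "HC"; // hors catégorie
  if (score >= 64000) return "Category 1";
  if (score >= 32000) return "Category 2";
  if (score >= 16000) return "Category 3";
  if (score >= 8000) return "Category 4";
  return "Uncategorised";
}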

Saturday, December 06, 2014

Non-geographic postcodes added to UK postcode list

I received a complaint that some postcodes that were in the original ONS data were missing from my UK postcode list. I was a little confused by this, since the import is pretty much automated. But looking at my import code, I realised I was ignoring postcodes with no location information. I think initially I’d assumed these were postcodes with duff data so ignored them, but for completeness I’ve now added them to the list, adding a few thousand more postcodes. Since they have no associated location data, there isn’t much interesting information available for each of these postcodes, but they may be useful for something… An example.

Saturday, November 29, 2014

UK house sales October 2014

I’ve just finished uploading the latest Land Registry UK house sales data to my website. I then realised my annual percentage change calculation was wrong and have fixed that.

Wednesday, November 26, 2014

UK Postcode data November 2014

I’ve uploaded the latest UK postcode data to my site. There seem to be a few changes to the data (some codes for wards and districts have changed and LSOA data is now available for Scotland). I think I’ve found all the anomalies, but let me know if you spot anything odd.

Sunday, November 23, 2014

More options when calculating route elevations

I’ve added a couple of options to my page for calculating route elevations. You can now select options to avoid highways and toll roads so it should be even more useful!

Saturday, November 15, 2014

Goodbye Endomondo, hello Strava and VeloViewer

Back in the day I used to go for bike rides just for the fun of it. Then I discovered Endomondo and was able to analyse every part of a ride. And being a geek, that motivated me to ride more, try to ride faster and try to ride further. I could try to beat my best time for 10 miles or distance covered in an hour. But the number of awards was kind of limited, so my motivation started to flag. I got some more enthusiasm by training for RideLondon 100. But once I’d completed that (or to be more accurate, the 86 miles that I was allowed to ride), I was looking for something else to keep me moving, especially now the days are getting shorter and the weather is deteriorating.
I’d heard of Strava before but I’d discounted it because there was no app for my Windows Phone (what’s new, right?). But then I realised Endomondo has the ability to export workouts, which can then be imported into Strava. So I gave it a try. And I was immediately hooked. Rather than comparing activity over a specific distance/time, I can compare efforts over the same bit of road, a segment. So for any ride I might pass through 20 or 30 segments and I generally beat a personal best on at least one of them. And that helps keep me going.
And for the ultimate geek experience, there’s VeloViewer. This runs on top of Strava and gives me an even more in-depth view of my performance on segments. It tells me I have cycled on 1472 segments and have a score of 86.4 (I have no idea if this is good or bad, but it’s heading in an upwards direction, which seems like a good thing).
VeloViewer was free but is now asking for £9.99 a year, with a pretty limited free version. This seems like a reasonable amount to pay for something so compelling. Strava also has a premium service, but I haven’t seen anything in there that I need. My guess is that Strava will probably start to require premium membership to use their API fully, which means any VeloViewer users will need to pay for their Strava account to use VeloViewer. I’d have thought that would be a no-brainer for Strava, since VeloViewer users are exactly the kind of people who are prepared to pay for a service and are therefore likely to cough up for Strava as well.
Anyway, that should keep me going during the long winter months. Maybe my next motivator will be my own little app running on top of Strava…

Friday, November 14, 2014

House price data now with annual change

The Land Registry sales data on my site now includes the annual price change. I thought this would smooth out the seasonal variations, but the annual change is still quite volatile. Even so, it may be of interest.
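
For anyone checking my numbers, the calculation is essentially the one below; a minimal sketch, with a method name of my own choosing.

// compare this month's average price with the same month a year earlier;
// comparing like month with like month should cancel out seasonality
static double AnnualPercentageChange(double priceThisMonth, double priceSameMonthLastYear)
{
  return (priceThisMonth - priceSameMonthLastYear) / priceSameMonthLastYear * 100;
}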

Thursday, October 02, 2014

Unable to get property 'setState' of undefined or null reference in ckeditor.js

I’ve been working on the latest and greatest Business Optix product, which dynamically creates lots and lots of HTML controls, some of which are CKEditor controls. Whilst testing the app and quickly navigating around it, I started getting the error “Unable to get property 'setState' of undefined or null reference” coming from deep in the bowels of CKEditor. Google astonishingly turned up nothing, but I remembered I’d had trouble in the past with CKEditor when I didn’t explicitly destroy the editors. The fix was simple, like this

// destroy existing editor instances before their DOM nodes are removed,
// otherwise CKEditor can later throw errors like the one above
for (var name in CKEDITOR.instances) {
  CKEDITOR.instances[name].destroy();
}

Friday, September 26, 2014

Property sales data for August 2014

I’ve uploaded the latest Land Registry data to my website. People comparing the data on my site to news stories like this may be wondering why my data shows a new all-time high average price, whereas the BBC say prices are yet to reach their previous peak of November 2007. I believe this is due to the Land Registry seasonally adjusting their figures, whereas the average values on my site use a geometric mean of the raw data.
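
For reference, the geometric mean of n prices is the nth root of their product, which is less skewed by a handful of very expensive sales than the ordinary average. It’s usually computed via logarithms so the intermediate product can’t overflow; a minimal C# sketch:

// needs System.Linq; exp of the mean of the logs is the nth root of the product
static double GeometricMean(IEnumerable<double> prices)
{
  return Math.Exp(prices.Average(p => Math.Log(p)));
}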

Saturday, August 30, 2014

UK property sales data for July 2014

The latest Land Registry data for property sales is now uploaded to my site. Not much can be read into a single month’s figures, but the average sale price has reached an all-time high, which I guess pleases a government seeking re-election soon and probably dismays first-time buyers.

Friday, July 25, 2014

Setting HttpResponse.StatusDescription silently failing

We recently received a complaint from one of our customers. We provide some fairly simple reporting functionality that allows more technical users to write their own SQL queries. The customer was building a report and, when there was a problem with his SQL, sometimes he’d get a detailed error message but other times he’d get nothing.

When we try to execute a query and an exception is thrown, we catch the exception and write the error message to HttpResponse.StatusDescription so it can be displayed in the browser. This can fail if the message is longer than 512 characters, but that failure is fairly obvious, since setting the StatusDescription property throws an exception. That clearly wasn’t the problem here.

I finally managed to reproduce the problem but was still confused about what was causing it. Then I spotted the difference between the working case and the non-working case: the non-working case contained newline characters. Thinking about it, this was fairly obviously going to be a problem. The HTTP response would not be valid, since the status line would be split in two. But it would certainly be preferable if .NET threw an exception in this case, rather than just silently failing to set the status description.
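
Schematically, the status description is the “reason phrase” on the very first line of the HTTP response, so a message containing a newline would look something like this (illustrative, not a real capture):

HTTP/1.1 500 Incorrect syntax near
'FROM'
Content-Type: text/plain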

So our code now looks something like this

        // stop IIS replacing our response with its own error page
        context.Response.StatusCode = 500;
        context.Response.TrySkipIisCustomErrors = true;
        // the status description must be a single line of at most 512 characters
        string description = ex.Message.Replace('\n', ' ').Replace('\r', ' ');
        if (description.Length > 512)
          description = description.Substring(0, 512);

        context.Response.StatusDescription = description;
        context.Response.ContentType = "text/plain";
        // the full message still goes in the body, where length isn't an issue
        context.Response.Write("An error occurred - " + ex.Message);

Sunday, June 29, 2014

Land Registry May 2014 data

I’ve uploaded the Land Registry data for May 2014 to doogal.co.uk. For all the talk of a house price bubble, the Land Registry data (arguably the most accurate of all the house price indices) doesn’t seem to show much movement at all over the last few months. Of course it’s a different story in London. There was a story about the housing insanity in Hackney a few months ago, and looking at the sales in East London does show prices shooting up over the last year.

Tuesday, May 27, 2014

ONS postcode data for May 2014 uploaded to doogal.co.uk

I’ve just uploaded the latest postcode data from the ONS to my site. There are over 2.5 million postcodes in there, alive and dead. My data checks suggest everything is in order, but let me know if you find a problem.

Monday, May 19, 2014

Downloading JavaScript-generated data

I have a number of web pages that generate data in text areas using JavaScript. The only way users could download this data was to copy and paste the contents of these text areas, but I wanted to add a download button to simplify the process. The problem is that this simply isn’t possible in JavaScript. The only client-side solutions I’ve seen either require Flash or are not supported in all browsers.

So I came up with a slightly hacky and inefficient solution. The basic idea is to post the data to the server and get the server to return it to the client as a download. The HTML looks like this

      <form action="download.php" method="post">
        <div>
          <input type="hidden" name="fileName" value="locations.csv" />
          <input type="submit" value="Download" />
        </div>
        <textarea id="geocodedPostcodes" style="width:100%;" rows="20" name="data"></textarea>
      </form>

All that is needed is a hidden field that tells the server-side script what the download file name should be and a text area with a name of “data”.

The server-side script is pretty simple; it looks like this

<?php
  // basename() stops a posted file name playing path tricks; PHP itself
  // rejects header values containing newlines
  $fileName = basename($_POST["fileName"]);
  header('Content-Disposition: attachment; filename="' . $fileName . '"');

  // echo the posted data straight back as the download
  print($_POST["data"]);
?>

All it does is get the requested file name and echo back the data.

It seems a bit crazy (and a waste of bandwidth) that this appears to be the only way to achieve such a seemingly simple task, but that looks to be the case. I’d be happy to be proved wrong.

Sunday, May 11, 2014

Help me go on a bike ride

Last year I saw the various Ride London rides on the telly and rolling through Kingston, and fancied doing it myself. Riding round London and Surrey on traffic-free roads is very appealing, compared to the usual stop-start, take-your-life-in-your-hands experience of cycling round these parts. So the first chance I had, I entered the Ride London-Surrey ballot. And in January I heard I’d missed out on getting a place.

There was one more option: sign up with a charity, raise some money and get a guaranteed place. So I decided to try and help out Cancer Research. Why Cancer Research? Primarily because cancer affects so many people at all stages of life, but also, on a personal level, because one of my partner’s best friends lost her life to cancer a couple of years ago, before she reached the age of 40.

I’ve set up a page for donations, added a link from my website and been amazed by the number of people who don’t know me who’ve already donated. If you’ve found this blog or my website useful, or are just feeling generous, then please consider donating some money. I will certainly appreciate it, as will Cancer Research.

Wednesday, April 30, 2014

Land Registry March 2014 data uploaded

I’ve uploaded the Land Registry house price data for March 2014 to my website. Now that all the sales data for 2013 has probably come in, it’s plain to see that sales volumes were up in 2013 and prices continue to drift upwards.

Friday, April 18, 2014

The perils of micro-optimisations

A debate has been raging on my website over the use of StringBuilder.AppendFormat in my exception logger code. OK, raging is something of an exaggeration; there have been two comments in two years. But the point made by both commenters is that rather than

error.AppendLine("Application: " + Application.ProductName);

I should be using

error.AppendFormat("Application: {0}\n", Application.ProductName);

The reasoning is that this avoids string concatenation, which is considered bad for performance. My main reason for not doing anything about it is laziness, but also the whole point of this code is that it only runs when an exception is thrown, which is hopefully a pretty rare event, so performance is not a major concern.

But then I wondered: what is the difference in performance between these two approaches? So I wrote a little test application that looks like this.

    // needs System.Diagnostics (Stopwatch), System.Text (StringBuilder)
    // and System.Windows.Forms (Application.ProductName)
    static void Main(string[] args)
    {
      for (int j = 0; j < 10; j++)
      {
        // try using AppendLine
        Console.WriteLine("AppendLine");
        StringBuilder error = new StringBuilder();
        Stopwatch sw = new Stopwatch();
        sw.Start();
        for (int i = 0; i < 1000000; i++)
        {
          error.AppendLine("Application: " + Application.ProductName);
        }
        sw.Stop();
        Console.WriteLine(sw.ElapsedMilliseconds);

        // try using AppendFormat
        Console.WriteLine("AppendFormat");
        error.Clear();

        sw.Restart();
        for (int i = 0; i < 1000000; i++)
        {
          error.AppendFormat("Application: {0}\n", Application.ProductName);
        }
        sw.Stop();
        Console.WriteLine(sw.ElapsedMilliseconds);
      }

      Console.ReadKey();
    }

The results from this app in milliseconds are as follows (reformatted for clarity)

AppendLine 307 315 321 372 394 370 289 298 300 296
AppendFormat 366 360 362 471 353 359 354 365 365 350

So which is quicker? Well, it looks like AppendLine might be marginally quicker. But, much more importantly, who the feck cares? We are repeating each operation 1 million times and the time to execute is still less than half a second. Maybe you can pick holes in my test application, but again I would ask, who the feck cares? Either approach is really fast.

And this is the main problem with trying to optimise this kind of stuff. We can spend huge amounts of time figuring out if one approach is quicker than another, but a lot of the time it doesn’t matter. Either the code runs quickly enough using any sensible approach, or it’s hit so infrequently that even a really poor implementation will work.

Of course we should consider performance whilst writing code, but we should only use particular approaches when we know they are going to produce more performant code. A good example is the StringBuilder class. We can be pretty sure this is going to be better than using string concatenation, otherwise it wouldn’t exist in the first place. That said, if you’re concatenating two strings I really wouldn’t worry about it.

But the key to writing efficient code is to understand what is slow on a computer. Network operations are slow. Disk access is slow. Because of that, anything that requires large amounts of memory (meaning virtual memory i.e. disk access) is slow. Twiddling bits in memory is quick. Fast code is achieved by avoiding the slow stuff and not worrying about the quick stuff.

And once you’ve written your code and found it doesn’t run as quick as you’d hoped, don’t jump in and replace calls to AppendLine with calls to AppendFormat, profile your application! Every time I profile an application, I’m amazed at the cause of the performance bottleneck; it’s rarely where I thought it would be.

If you don’t have a profiler, use poor man’s profiling. There are also free profilers available; I quite liked the Eqatec Profiler, which seems to be available from various download sites, although it’s no longer available from Eqatec. But whatever you do, don’t get into Cargo Cult Programming.

Saturday, March 29, 2014

Land Registry data for Feb 2014

I have uploaded the Land Registry house price data to doogal.co.uk.

There’s been a lot of talk in the press recently about there being a two-speed housing market: London and the rest of the UK. You can see this illustrated fairly clearly if you first look at house prices in Blackburn and then compare them with prices in West London.

Prices in Blackburn can be broken into three distinct periods. Prior to 2003, they were gently rising, probably in line with wage increases. Then in 2003 things went ballistic (I’m not sure of the trigger, although I’d guess it was easier access to mortgages). Five years later, in 2008, things ground to a halt, sales fell off a cliff and prices have been flat-lining ever since.

But look at West London and the only similarity is that sales volumes dropped off rapidly in 2008; otherwise you’d never know there was a financial crisis at all. Houses have been a one-way bet for nearly 20 years. You’ve got to wonder how long it can go on.

Thursday, March 27, 2014

Bluffer’s Guide to responsive design

A while back I spent a bit of time making doogal.co.uk more mobile friendly. I’d put it off for a long time, primarily because I thought it would be really hard. But actually it turned out to be not too tricky. So here is my not-so-comprehensive guide to making your site mobile friendly with responsive design.

The first thing to do is decide at what screen size your design will change from the normal design to the mobile design. My decision was that tablets should see the standard design but mobile phones should see the mobile design. So the media query I use for all my mobile CSS is

@media handheld, only screen and (max-width: 600px), only screen and (max-device-width: 600px) 

Better menus – My standard menus were tricky to use on a small screen, but fortunately they were built using styled unordered lists, i.e. <ul></ul>, so I was able to use the Mobilemenu JavaScript library. This converts the list into a dropdown <select>, which is much more useable on mobile devices.

Hide stuff – display:none is your friend. Most websites have bits on the page that may be useful but aren’t entirely essential. On a big screen we can get away with that, but on mobile devices it’s necessary to concentrate on the essential information. So hide anything that isn’t needed. I have several tables with many columns, but each row links through to more information, so I hid several of the columns since the tables didn’t render very well. The best approach to this appears to be CSS like the following

.postcodeTable  td + td { display: none;} 

No more table layouts – We’ve been told for years not to use tables to lay out our websites. But I’ve certainly been guilty of it, since it’s always easier to build a multi-column layout using a table. But now is when the chickens come home to roost. You may want two columns on the desktop, but on a mobile device you’ll probably want the two columns stacked on top of each other, so those tables will need to be converted to divs that float on the desktop and don’t on mobile devices.

Server side control – The solutions so far have all been client-side. I think this is generally the easiest way to deal with the issue, but there is certainly a good argument for saying content shouldn’t be getting pushed to the client if the client will never actually display it. If you’re using PHP on the back-end, Mobile Detect Library can be used to tailor your HTML before it leaves the server. One place where you may need to do this is adverts. Google’s T&Cs say you can’t hide adverts, so using display:none for them is probably a bad idea.

Sunday, March 09, 2014

The beginning of the end of Metastorm BPM

It looks like development of Metastorm BPM has, if not stopped completely, at least slowed down. So I thought I’d write down my thoughts on what was a big part of my professional career. If you want the full history, this isn’t it; have a look at Jerome’s book.

For me, it all started sometime in 1997. I was writing software for a firm called Bacon and Woodrow that sold actuarial software. It wasn’t really my kind of thing, but it was my first proper job, something to add to the CV. I got a call from a former colleague, Richard Kluczynski, who’d gone off to write his own software, then got a gig with a software house called Sysgenics. Before the days of mobile phones (or at least before I had one), I remember having to wander the streets of Epsom to find a phone box to call him back and discuss it properly, away from the office. It sounded interesting, but it wasn’t the right time for me as we were trying to get the first Windows version of our software out the door.

A few weeks or months later I got a call from a recruiter asking if I was looking for a new job and telling me about a company called Sysgenics. We were still trying to get our Windows version out the door, but I guess getting called about the same company twice piqued my interest. I remember looking at their website and getting pretty excited about the screenshots of some kind of graphical tool for building workflows called e-work. Before I knew it, I was in Wimbledon for an interview with Steve Brown and Jerome Pearce. And an hour later I was in the pub. This was obviously a great place to work!

And it was. We had no customers but some VC money to keep us ticking over. We were using the latest technologies (Delphi and MS Access!). Having no customers meant we could build stuff and break stuff without having to worry too much about upsetting people using the software, so we were always making a lot of progress.

Before long we were bought out by Metastorm. Looking back, that was actually a bit weird. Initially they seemed to be some massive software company, but looking closer, the one product they sold, InForms, was clearly coming towards the end of its life, since it was tied into the dying Novell Groupware. But they had what we needed, money, and I guess we had what they needed, some modern software to sell.

The years flew by, six in fact. By that point we had quite a few customers; the little startup was a proper software house. There was structure and rules, forms to fill in, basically not really my scene anymore. So I flew the nest to work for a financial software house in central London. But a couple of years after that, Jerome asked me to join his little band of Metastorm consultants. So I built a shed and got to work building stuff on top of Metastorm e-work. Metastorm e-work became Metastorm BPM, but we carried on calling it e-work… Metastorm started rewriting it from scratch in .NET, releasing it as version 9 (missing out version 8, some kind of off-by-one error I think).

I’d probably still be working in my shed for Jerome had the financial crisis not hit, causing major stress to our main client, who then couldn’t pay us. So I went to Croydon, regretted it almost immediately and started working for myself. Back to the shed…

Then I started doing some work for Steve’s new company, Business Optix. That eventually became a full time job and is where I am now. Meanwhile Metastorm got bought out by OpenText. Given the price they paid, you’d think anyone who’d taken up their share options would have done well out of the deal, but you’d be wrong. Somebody must have made a nice chunk of money out of it, but it wasn’t the people who’d originally developed the software (this isn’t bitterness on my part, I never exercised the option on my shares).

But not content with one BPM tool, OpenText also bought Cordys and Global 360. I guess the writing was on the wall for two of those products at that point, why would a company want three BPM tools? Anyway, it looks like Metastorm BPM is one of the victims. You have to wonder why OpenText bought them in the first place, presumably not for the customer base, already fed up with having to rewrite their processes for version 9, now fuming that they need to rewrite again in some other system.

Saturday, March 01, 2014

Land Registry sales data uploaded to doogal.co.uk

I’ve uploaded the latest Land Registry sales data, covering sales for January 2014, to my website. The good or bad news, depending on your point of view, is that prices continue to creep up. 

Thursday, February 27, 2014

ONS postcode data for February 2014 available

I’ve just uploaded the latest postcode data from the ONS to my website. My checks suggest everything is OK with the data, but let me know if you spot anything awry.

Thursday, January 30, 2014

Land Registry December 2013 data on doogal.co.uk

I’ve imported the latest Land Registry data to doogal.co.uk. As ever, little can be inferred from a single month of data, but draw your own conclusions. And let me know if you’d like to see this data presented in some other way.

Tuesday, January 28, 2014

doogal.co.uk is more mobile friendly

I’ve been trying to ignore mobile devices for a long time. Although doogal.co.uk has always been available on mobile phones, so long as you were happy to squint and zoom in and out a lot, it’s never been particularly useable. I’d kind of hoped that as more people started browsing the web from their phones, the phones would get bigger screens and I’d get away without doing anything. Whilst some of the phones have got bigger, it seems a lot of people don’t want to talk into a phone the size of a paperback book. Understandable really, they don’t want to look like dicks.

On top of that, Google keeps telling me my site isn’t mobile friendly, and it’s got to the point where a significant percentage of my visitors are using small devices to access the site, so I figured it was time to bite the bullet and fix it. So now if you visit from a smartphone, chances are you’ll see the more mobile friendly version of the site. In fact, if you squidge your desktop browser down to a small enough size, you’ll also see the mobile layout. Of course, just like the regular desktop site, it’s butt ugly, but there is a reason why web design doesn’t appear on my CV!

But if you spot anything broken, please let me know in the comments or email.

Saturday, January 18, 2014

Improving the performance of MySql bulk inserts

When I was trying to improve the performance of my old server, I came across this page. Unfortunately it didn’t help with the old server and just made me realise I needed to upgrade my server hardware.

But once I got my new hardware, I wanted some tweaks to improve the performance of importing postcode and property sales data. Both were taking days to complete. One suggestion from that page was to use SET autocommit=0;. I was a little sceptical, but was willing to try anything to speed up the import.

So all I did was add the statement at the start of every insert and add a counter to my code. After 1000 inserts, I committed the changes using the following

              // commit the batch every 1000 inserts
              if (count % 1000 == 0)
              {
                command.CommandText = "COMMIT";
                command.ExecuteNonQuery();
              }

And this made a huge difference to my imports. Whereas I was importing for days before, now the time taken was down to hours. Maybe I could improve it further by committing the changes even less frequently, but I’m pretty happy with the current situation.
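
For completeness, here’s roughly what the whole import loop looks like: a sketch assuming the MySql.Data connector, with a hypothetical table, columns and Postcode class purely for illustration. One detail my snippet above glosses over is the final COMMIT after the loop, which picks up the leftover rows from the last partial batch.

// sketch only: assumes using MySql.Data.MySqlClient; and a hypothetical
// Postcode class with Code/Latitude/Longitude properties
using (MySqlConnection connection = new MySqlConnection(connectionString))
{
  connection.Open();
  using (MySqlCommand command = connection.CreateCommand())
  {
    // turn off autocommit once for this connection
    command.CommandText = "SET autocommit=0";
    command.ExecuteNonQuery();

    int count = 0;
    foreach (Postcode row in rows)
    {
      command.CommandText =
        "INSERT INTO postcodes (postcode, latitude, longitude) " +
        "VALUES (@postcode, @latitude, @longitude)";
      command.Parameters.Clear();
      command.Parameters.AddWithValue("@postcode", row.Code);
      command.Parameters.AddWithValue("@latitude", row.Latitude);
      command.Parameters.AddWithValue("@longitude", row.Longitude);
      command.ExecuteNonQuery();

      count++;
      if (count % 1000 == 0)
      {
        // commit the batch every 1000 inserts
        command.CommandText = "COMMIT";
        command.ExecuteNonQuery();
      }
    }

    // commit the leftover rows from the final partial batch
    command.CommandText = "COMMIT";
    command.ExecuteNonQuery();
  }
}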

So in conclusion, if you need your data imports to run quicker, start using SET autocommit=0; now!

Thursday, January 02, 2014

November 2013 Land Registry data

I’ve uploaded the latest Land Registry property sales data to my website, covering November 2013. As always, not a lot can be inferred from a single month’s data, but drilling down to individual postcode areas shows fairly different stories for prices and sales over the long term.