Thursday, December 31, 2015

UK house prices November 2015

You can now view and download UK house price data for November 2015 from my website. Not much to report: prices continue on their ever-upward trajectory and transactions appear to be creeping upwards.

Tuesday, November 24, 2015

Postcode data for November 2015

I’ve uploaded the latest ONS postcode data for November 2015 to my website, all 2,554,806 of them. I’ve run my usual checks but let me know if you spot anything that looks incorrect.

Sunday, November 22, 2015

Fixing Strava elevation data

For some reason, Strava actually trust the data that comes from their users. More specifically, they use the elevation data from the user’s ride when the user creates a segment. From a technical point of view, this is definitely the easiest thing to do, but unfortunately GPS devices do occasionally lose their minds, so the data can be a mess. This can lead to garbage segment data, like this. A glance at the elevation profile makes it obvious that something is amiss. This dodgy data then means any derived data is also dubious, such as the climb category and the VAM numbers. The KOM rider on this particular segment has a VAM of 9,992, which is over 5 times what a drugged-up Lance Armstrong could achieve. Even my average VAM on category 4 climbs is over 1,000, which suggests I could make a good fist of keeping up with a bunch of professional cyclists. Which I couldn’t. Ever.
In an ideal world, Strava would fix up these dodgy segments in some way. One fix would be to average out all the elevation data from every rider who has ridden a segment. Alternatively, they could use the elevation data from one of the mapping services. Finally, they could make it easier to report bad data.
So whilst we wait for Strava to fix this issue, I thought I’d have a play with the second option. My Strava segment search tool now lets you view segments directly as well as opening them on Strava. This is what the example segment looks like. It uses Google Maps to calculate the elevation of the segment and adds that to the elevation profile, along with calculated statistics.
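As a sanity check on those leaderboard numbers, VAM is just vertical metres climbed per hour, so it’s easy to compute for any effort. A minimal sketch (the figures in the example are made up):

```typescript
// VAM = vertical ascent in metres per hour: metres climbed over the effort,
// scaled up to a full hour
function vam(metresClimbed: number, seconds: number): number {
  return (metresClimbed / seconds) * 3600;
}

// 400 m of climbing in 24 minutes works out at a VAM of 1,000
console.log(vam(400, 24 * 60)); // 1000
```

Against that yardstick, a VAM of 9,992 would mean climbing the height of Everest from sea level in well under an hour.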

Monday, November 09, 2015

UK stations data

As a prelude to some other work I might one day get round to, I’ve uploaded a list of UK train stations to my website. It comes in CSV and KML flavours, with the KML highlighting the busiest stations (mostly in the South East, as if you need to ask).

Thursday, October 29, 2015

UK property sale data for September 2015

It’s the end of the month so it’s time to upload the latest Land Registry property data to my website. The data crunching is still in progress but I’m off on holiday for a few days so can’t wait for it to complete before posting here (calculating all the various averages can take quite some time). Predictably enough, the data shows house prices continuing their upward march.

One sale this month caught my eye. Flat 4, 19 Terrapin Road was the first flat we bought, back in 1999. It’s just changed hands again, for a cool £530,000. So in 16 years, the price has increased over fivefold… Just one example of the insanity of the London housing market.

Tuesday, September 29, 2015

UK house prices August 2015

I’ve uploaded the latest house sale data from the Land Registry to my website. Prices seem to be ticking up at an increasing rate, while the number of sales is not changing much. One would imagine that without an increase in volumes, prices can’t remain at their current high level, but I could have said the same thing for the past 7 years…

Saturday, August 29, 2015

UK Property Sales July 2015

I have uploaded the latest Land Registry house price data for July 2015 to my site. Prices continue to rise moderately and sales continue to be at a low level.

Wednesday, August 26, 2015

August 2015 UK postcode data

I’ve uploaded the latest UK postcode data to my website. It now contains 2,551,959 postcodes, including live and terminated postcodes. I’ve run sanity checks on the data and all appears well but let me know if you spot any problems.

Friday, August 07, 2015

How to top a Strava segment leaderboard

Strava is all about the segments, and bragging rights are gained by being top of the leaderboard for a segment. But those of us living in areas with many other cyclists are very unlikely to be fast enough to top most of the local segments, even if we get pushed along by a massive tailwind. Here’s an example near me, with nearly 50,000 riders attempting it over half a million times. The leaderboard contains a number of professional cyclists, since a number of races have passed through, so I’m never going to get anywhere near the top (since you asked, I’m at about 10,000 currently).
But we all want our own KOM/QOM, so what to do?

Find an obscure segment

Head over to my Strava segment search tool and zoom in and pan around a bit. You should see quite a few more segments than you’ll find with Strava’s own search. They’ll generally be less popular segments and hence more likely to have beatable times. 

Create your own obscure segment

I have a couple of KOMs for my rides to and from work. These are fairly meaningless KOMs, since I’m the only person to ride one of the segments and the other has only been ridden by one other rider. But if that’s enough to make you feel you’ve made it as a rider, then go ahead and create your own segment. For this to work, you need to decrease the chance of anyone else riding your segment, so stick to obscure roads, make the segment fairly long and choose a route that nobody would ever normally follow. This is a brilliant example; one day I’ll get round to riding it to see if I can top the leaderboard.

Keep riding

A while back I headed out on a ride, going down some roads I hadn’t explored before. On getting home I discovered I’d topped a leaderboard without even trying. Admittedly only 10 other people have ridden the segment, but it still counts!


There are websites that will take the output from your bike computer and shift it around so it appears you went faster than you did. No, really. It’s obvious chasing after Strava KOMs is a fairly pointless activity, but cheating to do it has to be the most ridiculous thing ever.
But what about inadvertently cheating? Whilst on a ride, my GPS went a bit haywire and for a few minutes I was at the top of a leaderboard. I guess the algorithms at Strava spotted the mistake (or the former holder of the KOM) and I was demoted pretty quickly. But what about this one? Everybody on the first page of that leaderboard is averaging over 78mph, which is very impressive for a hilly segment round Richmond Park. But if you look at the actual rides for those amazing times, none of them bear any relation to the actual segment, they are just in the same general area. Figure out how that bug works and you could be topping lots of leaderboards.

Monday, July 13, 2015

Strava segment explorer

I’ve added a new page to my website that gives a new look at Strava segments. I’m not too keen on the segment explorer on the Strava site. It doesn’t let me zoom in fully, so I can’t find obscure segments that a slowcoach like me might have a chance of gaining a KOM on. It also only ever displays 10 segments at a time, so I can’t get a good feel of what’s around.
My explorer fixes these issues. I have a load of ideas on how to improve it, so let me know if you find it useful and what you’d like to see.

Saturday, June 27, 2015

UK property sales May 2015

I’ve uploaded the latest Land Registry house price data to my site. Prices are still slowly ticking up but it seems the mini-boom in sales of 2013 and 2014 is over. 

Wednesday, June 03, 2015

UK postcode data for May 2015

I have uploaded the latest ONS postcode data to my website. This means all the latest postcodes in England, Wales and Scotland are now available. Northern Ireland continues to go its own way with its licensing, so I am unable to update the postcodes I have from 2008. If you’d like to see up-to-date Northern Irish postcodes, I suggest contacting NISRA and asking them to relax their licensing to bring it in line with the rest of the UK.

Saturday, May 30, 2015

Retrieving the most popular pages using Google Analytics API again…

I occasionally run a little app I wrote 4 years ago to grab the most popular pages on my website via Google Analytics and update the database so the website can display a top 10 list of those pages. I tried to run it this morning and it fell over in a heap. It seems that Google no longer supports the API I was using. Ho hum, shit happens, software does rust…

So time to drag out my old code. Except I couldn’t find it. So time to look at Google’s latest and greatest API and rebuild it from scratch. Here’s what’s required and a small code sample.

First download the Google API .NET libraries. There seem to be a whole host of Google API libraries on NuGet, but for Google Analytics the following should get what you need, along with all the dependencies.

Install-Package Google.Apis.Analytics.v3

Then you’ll need to create a service account in the Google Developers Console. After creating this service account, create a P12 key for it and save it somewhere on your computer. Then add the service account email address to the Google Analytics account you want to access.

Next fire up Visual Studio, create a console application and add the following code:

using Google.Apis.Analytics.v3;
using Google.Apis.Analytics.v3.Data;
using Google.Apis.Auth.OAuth2;
using Google.Apis.Services;
using System;
using System.Security.Cryptography.X509Certificates;

namespace UpdateTop10
{
  class Program
  {
    static void Main(string[] args)
    {
      try
      {
        Read();
      }
      catch (Exception ex)
      {
        Console.WriteLine("ERROR: " + ex.Message);
      }
      Console.WriteLine("Press any key to continue...");
      Console.ReadKey();
    }

    private static void Read()
    {
      String serviceAccountEmail = "<the service account email address>";

      var certificate = new X509Certificate2(@"<the location of your P12 key file>",
        "notasecret", X509KeyStorageFlags.Exportable);

      ServiceAccountCredential credential = new ServiceAccountCredential(
        new ServiceAccountCredential.Initializer(serviceAccountEmail)
        {
          Scopes = new[] { AnalyticsService.Scope.AnalyticsReadonly }
        }.FromCertificate(certificate));

      // Create the service.
      var service = new AnalyticsService(new BaseClientService.Initializer()
      {
        HttpClientInitializer = credential,
        ApplicationName = "<your application name in Developer Console>"
      });

      // Run the request: visits over the last month, broken down by page.
      DataResource.GaResource.GetRequest req = service.Data.Ga.Get(
        "ga:<the ID of the analytics view found under Admin/View Settings>",
        DateTime.Now.AddMonths(-1).ToString("yyyy-MM-dd"),
        DateTime.Now.ToString("yyyy-MM-dd"), "ga:visits");
      req.Dimensions = "ga:pagePath";
      req.Sort = "-ga:visits";
      GaData data = req.Execute();

      // Print the top 20 pages with their visit counts.
      for (int i = 0; i < 20; i++)
        Console.WriteLine(data.Rows[i][0] + ": " + data.Rows[i][1]);
    }
  }
}

Fill in the bits between angle brackets with your details and give it a go. The top 20 most visited pages from the last month should appear in your output.

Overall it wasn’t too painful. Some of the examples on the web seem to be written for older versions of the API, which can cause some confusion, and Google have such a huge number of APIs out there that finding the right one can be tricky, but once those hurdles are overcome it’s reasonably straightforward.

Update - Windows Live Writer currently doesn't work with Blogger accounts; I'm guessing this is the same issue that I was having with my old little app. Hopefully Google told everyone they were turning off ClientLogin support, but it appears Microsoft didn't get the memo… And hopefully it gets fixed soon, because the Blogger editor is effing terrible.

Monday, May 18, 2015

UK constituency and administrative area KML

A strange thing happened last week, visits to my site were affected by real world events. That’s the first time that has happened. The cause was the UK election and the pages affected by it were the pages devoted to UK electoral constituencies and in particular the constituency of South Thanet. After having a look at those pages, I came to the conclusion that any visitors may have been quite disappointed with the data available on those pages. So I’ve spent a bit of time improving them. They now include the area polygons of each constituency and also provide an estimate of the population and number of households in each constituency.

Whilst I was at it, I also added area polygons for administrative districts and wards. Enjoy!

Thursday, May 14, 2015

No trouble with counties

I wrote recently about the trouble I’ve had figuring out what to do about counties in the postcode data on my site. Thanks to a very helpful comment on that post, I finally figured out what I hope is the correct course of action. Previously I only showed county information for postcodes that were located in a county council area. The suggestion was to map postcodes to ceremonial counties, which I wasn’t really aware of before. A quick look at Wikipedia suggested there was a pretty simple mapping between them and administrative areas, with the exception of a little complexity around Stockton-on-Tees. So that’s what I’ve done. You can now download postcode data for each English county here. Hopefully this meets all your county needs!
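For the curious, the mapping is simple enough to sketch in code. This is just an illustration with a handful of hand-picked areas, not the full table I actually use:

```typescript
// A few administrative areas and their ceremonial counties. Stockton-on-Tees
// is the awkward case: the part of the borough north of the Tees falls in
// County Durham and the part south of it in North Yorkshire.
var ceremonialCounty: { [adminArea: string]: string } = {
  "Blackburn with Darwen": "Lancashire",
  "Blackpool": "Lancashire",
  "Darlington": "Durham"
};

function lookupCeremonialCounty(adminArea: string, northOfTees?: boolean): string {
  if (adminArea === "Stockton-on-Tees") {
    return northOfTees ? "Durham" : "North Yorkshire";
  }
  return ceremonialCounty[adminArea] || "";
}

console.log(lookupCeremonialCounty("Blackburn with Darwen")); // Lancashire
```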

Wednesday, May 06, 2015

Distance to sea, house price improvements

I made a few improvements to the website over the Bank Holiday weekend. Ever had a burning desire to know the distance from a postcode to the sea? Well now you can find out.

I’ve also updated the way individual house prices are listed so that all sales for a particular property are grouped together (example).

Saturday, May 02, 2015

UK house price data March 2015

I’ve uploaded the latest Land Registry house price data for March 2015 to my site. Prices continue to chug along at 4-5%, as they have been doing for the last 12 months. The CSV download data now includes a yearly summary and I’ve included charts for this annual data at the postcode district level (monthly data is too volatile to show anything meaningful in such small areas).

The trouble with counties

I include some county information with my UK postcode download data, but I get quite a few questions regarding it. The most common question is why don’t all the postcodes have an associated county. The answer is that county information is only shown for postcodes that are located in an administrative county council. So LA1 postcodes are listed as part of Lancashire but BB1 postcodes aren’t, even though most people would consider Blackburn to be in Lancashire. This page gives you an idea of how this works. Select ‘Counties’ in the dropdown and see all the gaps.

The second question that generally follows is whether I could include county information for each postcode. This is where things get tricky. Have a read of this Wikipedia page on the subject of counties. In short, there are the administrative county councils in use today, there are historical counties whose boundaries have changed many times and there are postal counties that used to be supplied by the Royal Mail.

So if I wanted to add county information for every postcode, my first decision would be which of these to use. The Royal Mail seem pretty keen to get rid of postal counties and the information is not provided with the freely available postcode data, so that’s not an option.

So another option would be to use historical counties. Leaving aside the fact these boundaries changed many times, I’m not sure using them would satisfy most users of my download data anyway. Taking an example from the referenced Wikipedia page, how many people would consider Brixton to be in Surrey?

And the final option would be to fill the county data from the local authority. So Brixton would now be in London, but BB1 would now be in ‘Blackburn with Darwen’. Now in this case it’s obvious that any postcodes in the ‘Blackburn with Darwen’ area should show Lancashire as the county so I could possibly map all these authorities to sensible counties, with some work. But even then, would this provide people with what they want? Kingston would be in London, although some people would expect it to be in Surrey, since Surrey county council is based there (and use Surrey in their address).

So in conclusion, counties are blooming tricky and I suspect no one size would fit all, hence the incomplete set of data on my site. If anyone has an opinion on whether there is a good approach to this, let me know.

Monday, March 30, 2015

Cycling halfway round the world

In July 2012 I started to use Endomondo to track my bike rides. Since then some things have changed in my cycling: I’ve moved from a mountain bike to a road bike, I’ve tried cycling 100 miles in one day (only to be thwarted by the weather) and I’ve started using Strava, but I’ve continued to use Endomondo. And today I reached the milestone of getting halfway round the world.


Last year I covered almost 6,000 miles. If I can continue at that level, then in just over a couple of years I will have completed my virtual trip round the world. If you want to encourage me, you can sponsor me in this year’s Ride London 100.

Saturday, March 28, 2015

Loading Google Maps asynchronously

Google’s PageSpeed keeps telling me I should load scripts asynchronously to improve the performance of my website. Now in an ideal world I’d use something like RequireJS to implement this for all my scripts, but frankly that seems like a bit of a big task and more than likely I’d stuff it up and break large chunks of my web site. So I thought I’d start small and just load up Google Maps asynchronously. Google provide an example of how to do this, but I wanted to encapsulate that in a simple reusable function with a callback function parameter to run after the library had loaded. This is what I came up with in TypeScript.

function loadGoogleMaps(libraries: string, callback: () => void) {
  // Google Maps calls this global function once the script has loaded
  window["initGM"] = () => {
    callback();
  };

  var script = document.createElement("script");
  script.type = "text/javascript";
  var url: string = "https://maps.googleapis.com/maps/api/js?callback=initGM";
  if (libraries != null && libraries !== "") {
    url += "&libraries=" + libraries;
  }
  script.src = url;
  document.body.appendChild(script);
}

UK house price data February 2015

I’ve uploaded the latest Land Registry data to my site. Prices continue on their seemingly never ending upward march.

Thursday, March 05, 2015

A cross browser XML parser

A post from three years ago detailing how to implement selectSingleNode for XML documents in a cross-browser friendly manner is still getting a good number of hits. Which I guess shows that developers still need to manipulate XML in browsers, even with the increasing popularity of JSON. When we started to rewrite our desktop app on the web, JSON seemed like the obvious choice, but we use XPath in a big way and we wanted our web app to be compatible with our desktop app, so we stuck with XML, which meant having to deal with the different ways XML is supported in different browsers. So we now have a reasonably well featured cross browser XML parser, the source of which you’ll find below.

A couple of things to note: this probably works in IE8 and below, but I’ve never tested it since our app needs at least IE9. Also, the code is TypeScript rather than JavaScript, since TypeScript is slightly less insane…

// stop TypeScript complaining about stuff we don't have definitions for
interface Window {
  DOMParser: any;
}
declare var XPathResult;
declare var ActiveXObject;

class XmlWrapper {
  private xmlDoc: any;

  constructor(xml: string) {
    try {
      // try Internet Explorer first. Although later versions have DOMParser, they don't implement evaluate
      this.xmlDoc = new ActiveXObject("Microsoft.XMLDOM");
      this.xmlDoc.async = false;
      this.xmlDoc.setProperty("SelectionLanguage", "XPath");
      this.xmlDoc.loadXML(xml);
    } catch (ex) {
      if (window.DOMParser) {
        var parser = new DOMParser();
        this.xmlDoc = parser.parseFromString(xml, "text/xml");
      } else {
        throw new Error("Can't find an XML parser!");
      }
    }
  }

  public selectSingleElement(xmlNode, elementPath: string): Element {
    return <Element>this.selectSingleNode(xmlNode, elementPath);
  }

  public selectSingleNode(xmlNode, elementPath: string): Node {
    if (xmlNode == null) {
      xmlNode = this.xmlDoc;
    }

    if (this.xmlDoc.evaluate) {
      var doc = xmlNode.ownerDocument;
      if (doc == null) {
        doc = xmlNode;
      }
      var nodes = doc.evaluate(elementPath, xmlNode, null, XPathResult.ANY_TYPE, null);
      return nodes.iterateNext();
    } else {
      return xmlNode.selectSingleNode(elementPath);
    }
  }

  public selectElements(xmlNode, elementPath: string): Element[] {
    return <Element[]>this.selectNodes(xmlNode, elementPath);
  }

  public selectNodes(xmlNode, elementPath: string): Node[] {
    if (xmlNode == null) {
      xmlNode = this.xmlDoc;
    }

    if (this.xmlDoc.evaluate) {
      var doc = xmlNode.ownerDocument;
      if (doc == null) {
        doc = xmlNode;
      }

      var resultsArray = [];
      var results = doc.evaluate(elementPath, xmlNode, null, XPathResult.ANY_TYPE, null);
      var thisElement = results.iterateNext();
      while (thisElement) {
        resultsArray.push(thisElement);
        thisElement = results.iterateNext();
      }

      return resultsArray;
    } else {
      return xmlNode.selectNodes(elementPath);
    }
  }

  get documentElement(): Element {
    return <Element>(this.xmlDoc.documentElement);
  }

  public xml(): string {
    // for IE
    if (this.xmlDoc.documentElement.xml) {
      return this.xmlDoc.documentElement.xml;
    }

    // Chrome, Firefox
    if (this.xmlDoc.documentElement.outerHTML) {
      return this.xmlDoc.documentElement.outerHTML;
    }

    // Safari
    return (new XMLSerializer()).serializeToString(this.xmlDoc.documentElement);
  }

  public static getNodeText(node: Node): string {
    if (node == null) {
      return "";
    }

    var value: string = (<any>node).text;
    if (node.textContent) {
      value = node.textContent;
    }

    return value;
  }

  public static setNodeText(node: Node, value: string): void {
    if (node == null) {
      return;
    }

    if ((<any>node).text !== undefined) {
      (<any>node).text = value;
      return;
    }

    if (node.textContent !== undefined) {
      node.textContent = value;
    }
  }
}
Sunday, March 01, 2015

Google Earth Pro is kind of free

I was excited when Google announced Google Earth Pro was going to become free. A lot of geographical data comes in Shape files, but these aren’t usable in any free applications I know of and can’t really be used on the web without converting to something like KML first. But Google Earth Pro can load up Shape files and convert them to KML/KMZ.

Downloading isn’t a problem but getting hold of a key doesn’t seem to work as far as I can tell. Following the link to grab a free key just keeps redirecting to the download page. So I gave up in the end and grabbed an illicit key instead. Not sure of the legality of that, but if it’s meant to be free and I still need a key and Google won’t give me one, what choice did I have? Anyway a search for “google earth pro serial key or number” gave me a link to a site that produced a key I could use.

Saturday, February 28, 2015

UK house price data January 2015

I have uploaded the latest Land Registry house price data to my site. Prices continue their gradual ascent.

The BBC reports the blindingly obvious that there is a wide gap in regional house prices. This is clearly visible if you compare my place of birth with my current abode. Blackburn, like many areas outside the South East, has seen pretty much static house prices since 2008. Kingston upon Thames, like most other areas in London, took a bit of a breather in 2008 and then continued on its upward trajectory. I guess the interesting question is whether this is a permanent change or if the differences in regional prices will revert back to their historical average at some point. I can’t pretend to know the answer to that, but I do know prices in London are insane…

Wednesday, February 25, 2015

UK Postcode data for February 2015

The ONS have released the latest version of their postcode dataset and I have now uploaded it to my website. A few sanity checks suggest it is OK, but let me know if you spot anything strange.

Sunday, February 08, 2015

How to beat a Strava PR on every ride

I think getting to, or maintaining, a healthy weight is fairly straightforward, in principle at least. Match your calorie intake with your calorie burning. Riders on the Tour de France eat 9000 calories a day, but don’t put on any weight for the obvious reason that they burn through all those calories. Eating less has never appealed to me, so the exercise side of the equation is the one I try to work on and it generally works OK for me.

But motivation can be a problem. Cycling is my exercise of choice and a winter of cold, wet and windy weather can rather reduce the will to get out on the road. I’ve found a few things to motivate me in the past, signing up with Endomondo (and trying to beat various personal bests), buying a new bike and training for the Ride London 100 to name a few.

Strava is the latest motivator. Every ride gives the opportunity to beat a PR on one segment or another, so there are many more chances to get a little boost from receiving a medal at the end of a ride. But sometimes, nothing. A ride of an hour may lead to no achievement.

But there is a way to almost guarantee a PR on every ride. First, sign up with veloviewer. This provides even more geeky information about every segment you’ve ridden. Next, filter the segment list for ones you’ve only ridden once or twice. Next, check the weather to find out the prevailing wind. Now find some segments where the wind will be behind you. Then plan your ride to include those segments. And finally, ride the route!
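The segment-picking step can even be automated if you export your effort counts. A rough sketch, using a made-up segment shape (veloviewer doesn’t expose exactly this, as far as I know):

```typescript
// A hypothetical export of segments: how many times you've ridden each one,
// plus the direction the segment points, in degrees clockwise from north
interface SegmentEffort {
  name: string;
  timesRidden: number;
  bearing: number;
}

// Treat a segment as a PR candidate when you've ridden it no more than twice
// and the direction the wind is blowing towards is within 60 degrees of the
// segment's bearing (i.e. a tailwind)
function tailwindTargets(segments: SegmentEffort[], windTowards: number): SegmentEffort[] {
  return segments.filter(s => {
    var diff = Math.abs(((s.bearing - windTowards) + 540) % 360 - 180);
    return s.timesRidden <= 2 && diff <= 60;
  });
}

var segments: SegmentEffort[] = [
  { name: "Quiet lane north", timesRidden: 1, bearing: 10 },
  { name: "Busy climb south", timesRidden: 50, bearing: 180 },
  { name: "New road west", timesRidden: 2, bearing: 270 }
];
// with the wind blowing towards the north, only the quiet northbound lane qualifies
console.log(tailwindTargets(segments, 0).map(s => s.name)); // ["Quiet lane north"]
```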

And voila, chances are you will beat one of your PRs on that ride. And even better, you’ll probably have ridden some new segments during your ride, which will now be in your list of potential PR segments.

I’ve been using this technique for the last few months and I’ve still got 140 segments within 5 miles of my house that I’ve ridden two or fewer times. Of course I’ll get bored of this at some point, but maybe I’ve already found my next motivational tool (aside from Ride London 2015): increasing my Eddington number.
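In case the Eddington number is new to you: it’s the largest number E such that you’ve ridden at least E miles on at least E days, and it’s straightforward to compute from a list of daily mileages:

```typescript
// Eddington number: the largest E where at least E days had rides of at least E miles
function eddington(dailyMiles: number[]): number {
  var sorted = dailyMiles.slice().sort((a, b) => b - a); // longest days first
  var e = 0;
  while (e < sorted.length && sorted[e] >= e + 1) {
    e++;
  }
  return e;
}

// five of these six days cover at least 5 miles, but only five cover 6+
console.log(eddington([10, 30, 25, 5, 40, 28])); // 5
```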

Saturday, January 31, 2015