Friday, December 30, 2011

Why all the free ads for Facebook?

A while back I picked up a copy of the Evening Standard and was surprised to see Facebook and Twitter logos at the top of every page, suggesting readers follow ES on these two websites. And this is just an extreme example of what I’m seeing more and more. Ads on the telly and in newspapers no longer show the URL of the company’s website, but the URL of their Facebook page instead.

But the thing that confuses me is why big companies would choose Facebook as their main point of contact with their customers. Sure, social networks are the big thing at the moment and getting users to follow you or like your product might have some benefit, but there seem to be a number of downsides.

First, who owns all the data being collected about your customers? My guess is Facebook. Can you extract that data if Facebook decide they don’t want you anymore or you decide to move? I guess it’s probably possible via the Facebook API but it seems somewhat risky. And even if you do extract it, Facebook will no doubt keep hold of it as well.

Then there’s the question of ads. The company pages I’ve seen on Facebook seem to have the same ads as any other page. I’ve found no indication that companies get any of the income from those ads, so why drive traffic to Facebook so they can make money from your brand? And what if Facebook decide to show ads for one of your competitors?

Frankly it all seems a bit odd. Big companies have big IT departments and generally have their own websites, fully under their control. It’s pretty simple to add some Facebook widgets to your own site and get integration that way, which seems a saner approach if you want to get hooked into Facebook.

For a one-man-band kind of company, I can see the sense in putting your web presence on Facebook, since it’s a lot simpler and cheaper than building your own website. But for multi-nationals, my prediction for 2012 is that this is something they’ll do much less of.

Monday, December 26, 2011

I am Sheldon…

Sheldon / Doogal

…with a somewhat larger waistline, a somewhat smaller IQ and hopefully fewer OCD tendencies.

Friday, December 02, 2011

Goodbye Google Friend Connect

I’ve been disenchanted with Google Friend Connect for a while. I only used it for commenting on my website, but it has a number of weaknesses:

  • A crappy user experience
  • No notifications of new comments, which is a pain if you only receive a handful of comments
  • Ignoring query strings in URLs, so comments don’t stick to the right page
  • Weird date formatting and no control over how they appear

For a long while I assumed Google would actually update the widgets, but I’m not sure anything ever changed after the initial release. It was released and then, nothing. Even a visit to the site shows a copyright notice from 2009, which suggests they haven’t done much with it for some time.

So I had been meaning to convert to some other system but just hadn’t got round to it. Then I noticed Google have decided to can it (which of course hasn’t been mentioned anywhere on the Friend Connect site itself), so I decided it was time to finally do something. Google’s suggested solution is to hook into Google+, but that seems like a pretty useless way to add commenting to a website. So my suggested solution is to sign up to Disqus; it takes about ten minutes to plug it into your website and it looks pretty good straight out of the box.

Tuesday, October 11, 2011

Google Maps in a desktop app

I’ve seen one or two examples of using Google Maps in a WinForms desktop app, but the ones I’ve seen seem to involve loading image tiles from the Google server directly. There’s nothing wrong with that approach, but I thought it would be a lot simpler to host a local web page using the Google Maps API in a WebBrowser control in an application. Here is a very simple example of this idea.

If you want to extend this example, it’s possible to call scripts in the page via the WebBrowser.Document.InvokeScript method and the application can respond to events in the page via the WebBrowser.ObjectForScripting property.
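To make that concrete, here’s a minimal sketch of the idea; the page markup and the panTo and MapClicked names are made up for illustration, and a real app would probably ship the HTML as a separate file rather than embedding it in a string.

using System;
using System.Runtime.InteropServices;
using System.Windows.Forms;

namespace MapsHost
{
  // ComVisible so the page's JavaScript can call back via window.external
  [ComVisible(true)]
  public class MapForm : Form
  {
    private readonly WebBrowser browser = new WebBrowser();

    // a minimal page that loads the Google Maps JavaScript API
    private const string MapPage = @"<!DOCTYPE html>
<html><head>
<script src=""http://maps.google.com/maps/api/js?sensor=false""></script>
<script>
var map;
function init() {
  map = new google.maps.Map(document.getElementById('map'), {
    zoom: 8,
    center: new google.maps.LatLng(51.5, -0.1),
    mapTypeId: google.maps.MapTypeId.ROADMAP
  });
  google.maps.event.addListener(map, 'click', function(e) {
    window.external.MapClicked(e.latLng.lat(), e.latLng.lng());
  });
}
function panTo(lat, lng) { map.panTo(new google.maps.LatLng(lat, lng)); }
</script></head>
<body onload=""init()"" style=""margin:0"">
<div id=""map"" style=""width:100%;height:500px""></div>
</body></html>";

    public MapForm()
    {
      Text = "Google Maps";
      browser.Dock = DockStyle.Fill;
      browser.ObjectForScripting = this; // exposed to the page as window.external
      browser.DocumentText = MapPage;
      Controls.Add(browser);
    }

    // called from the page's JavaScript via window.external.MapClicked(lat, lng)
    public void MapClicked(double lat, double lng)
    {
      Text = string.Format("Clicked {0:F4}, {1:F4}", lat, lng);
    }

    // calls the panTo function defined in the page
    public void PanTo(double lat, double lng)
    {
      browser.Document.InvokeScript("panTo", new object[] { lat, lng });
    }

    [STAThread]
    static void Main()
    {
      Application.Run(new MapForm());
    }
  }
}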

As an aside, similar ideas can be applied to hosting other Javascript web components, such as an HTML editor like CKEditor.

Thursday, July 14, 2011

Google Maps and Friend Connect weirdness in IE8

I received a bug report about my website. The maps that appear on my UK postcodes pages weren’t working in IE8. I hadn’t noticed since they were working fine in IE9 but when I switched to compatibility view in IE9 I started to see the same problem. This helped because I was then able to debug my JavaScript (much as I love IETester, I’d love it even more if I had access to the developer tools for each version of IE). And debugging the script revealed google.maps was null.

At this point I assumed this was a problem with my Google Maps script, so I tried loading Google Maps asynchronously, then tried specifying an older version of the API. Neither helped. I tried inserting the script in my HTML head, but with no luck.

Finally I had a look at the google object and saw that the only thing defined in there was friendconnect. Now my suspicions moved away from Google Maps to Google Friend Connect. My Friend Connect stuff appears at the bottom of each page, and the script reference for it was also down at the bottom of the page. So I tried moving the Friend Connect script reference to the HTML head. And voilà, the maps started working again.

So the conclusion? I’m not really sure. I suspect Friend Connect is clobbering the google.maps object, though it seems odd that it only happens in IE8.

Saturday, June 04, 2011

Retrieving the most popular pages using Google Analytics API

For a long time I’ve shown the most popular pages on the home page of my website. I did this by logging every page that was viewed to the MySql database on the back end. This kind of worked but had a few problems. First, it wasn’t very clever since it couldn’t tell the difference between a real visitor and a search engine bot. Second, since I’ve started to get quite a few visitors (no, really), it was writing a large amount of data to the database.

So I thought there must be a better solution. Figuring that all the information I needed was already being collected by Google Analytics, I thought I could grab this data and dump it into a much smaller table with just the page URL and the number of visits (rather than adding a row for every visit). So I coded up a solution using the .NET wrapper around the Google Analytics API. And this is what it looks like (with the database access code removed for clarity). You’ll need to provide your own email address, password and Google Analytics account table ID to get this to work, for obvious reasons.

using System;
using Google.GData.Analytics;

namespace GoogleAnalytics
{
  class Program
  {
    static void Main(string[] args)
    {
      AnalyticsService service = new AnalyticsService("DoogalAnalytics");
      service.setUserCredentials("email address", "password");

      // ask for visits per page over the last month, most visited first
      DataQuery pageViewQuery = new DataQuery("https://www.google.com/analytics/feeds/data");
      pageViewQuery.Ids = "Google Analytics account table ID";
      pageViewQuery.Metrics = "ga:visits";
      pageViewQuery.Dimensions = "ga:pagePath";
      pageViewQuery.Sort = "-ga:visits";
      pageViewQuery.GAStartDate = DateTime.Now.AddMonths(-1).ToString("yyyy-MM-dd");
      pageViewQuery.GAEndDate = DateTime.Now.ToString("yyyy-MM-dd");

      DataFeed feed = service.Query(pageViewQuery);
      // take the top 20 pages (or fewer, if the feed returns less)
      int count = Math.Min(20, feed.Entries.Count);
      for (int i = 0; i < count; i++)
      {
        DataEntry pvEntry = (DataEntry)feed.Entries[i];
        // strip the leading / from the page path
        string page = pvEntry.Dimensions[0].Value.Substring(1);
        string visits = pvEntry.Metrics[0].Value;

        Console.WriteLine(page + ": " + visits);
      }

      Console.ReadLine();
    }
  }
}

Monday, May 30, 2011

Poor man’s XSLT profiling for .NET

If you’ve ever looked round for a profiler for XSL transformations then chances are you’ve found the Microsoft add-on for Visual Studio, which looks like just the ticket, provided you happen to have Visual Studio Team System. But if you don’t own that edition, it might look like you have to upgrade your VS licence or buy some other XSLT profiler.

But if you happen to own a .NET profiler (I highly recommend AQTime) then there may be another solution. Visual Studio comes with the xsltc tool, which can be used to generate an assembly from an XSL transformation. Once we’ve got an assembly, we can build a small wrapper application that loads it, passes it to an instance of the XslCompiledTransform class and calls the transform. And once we’ve got that, we can use a standard .NET profiler to find bottlenecks.
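Here’s a rough sketch of such a wrapper, assuming the stylesheet was compiled beforehand with something like xsltc /class:CompiledTransform Transform.xsl /out:Transform.dll (the class and file names are illustrative) and that the .NET 4 Load(Type) overload is available:

using System;
using System.Reflection;
using System.Xml;
using System.Xml.Xsl;

namespace XsltProfiling
{
  class Program
  {
    static void Main(string[] args)
    {
      // load the xsltc-generated assembly and find the compiled stylesheet type
      Assembly assembly = Assembly.LoadFrom("Transform.dll");
      Type stylesheetType = assembly.GetType("CompiledTransform");

      // .NET 4's Load(Type) overload accepts xsltc output directly
      XslCompiledTransform transform = new XslCompiledTransform();
      transform.Load(stylesheetType);

      // run the transform repeatedly so it dominates the profiler's results
      for (int i = 0; i < 100; i++)
      {
        using (XmlWriter writer = XmlWriter.Create("output.xml"))
        {
          transform.Transform("input.xml", writer);
        }
      }
    }
  }
}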

And as I understand it, the XSLT profiler add-on for Visual Studio works in just this way, so profiling using this technique should be just as effective as the Microsoft version.

Thursday, April 14, 2011

Spotify not too good to be true anymore

Apparently I’ve been using Spotify for over two years. Funny, it seems longer than that. It was the perfect music service for me: unlimited music of my choosing on my PC, which is where I listen to music most of the time, with the only minor downside being some adverts that play occasionally between tracks. But it looks like it won’t be quite so perfect anymore. Free users can only listen to a track a total of five times, and total listening time will be limited to 10 hours a month.

As a frequent user of Spotify I can see why they’re doing this. First, it’s obvious that advertising revenue is not what they were hoping for; most of the ads are still for Spotify itself. Second, using it has had a perhaps not unexpected effect on my music buying behaviour. For example, U2 put their last album up on Spotify before its official release. I had a listen and realised it was rubbish, so as a marketing exercise I doubt it was a huge success. And quite a few new releases are put onto Spotify Premium upon release. On a couple of occasions I’ve then purchased the album before it became available on the free version (Elbow and Arcade Fire, if you’re wondering). I’m guessing this isn’t what Spotify wanted me to do; they were presumably hoping I’d pony up for the Premium version. And then when those albums did become available on the free version (generally only a few weeks after release) I was hit with a mild feeling of regret for spending money that I didn’t really need to, and decided to think twice before making another purchase. Again, probably not what the music industry sponsors of Spotify were hoping for.

So now I’ve got a choice: sign up for a tenner a month and continue as I am at the moment, or spend that tenner on a CD every month. I guess the music industry don’t care too much which way that ten quid gets to them, so it’s purely a personal dilemma. But the people who can’t or won’t spend a tenner (teenagers and students mostly, I guess) will probably rediscover the skill of searching for pirated albums on Google. The music industry is still caught between a rock and a hard place.

Saturday, March 12, 2011

Updating the Code-Point postcode dataset in MySql

Some time ago I imported the Ordnance Survey Code-Point postcode dataset into MySql. It looks like there’s a new version of that dataset available which includes new postcodes, so I wanted to update my database. I guess I could just empty the table and re-import the data, but since it takes some time to import and the data is live on the web, this wasn’t the ideal solution. Fortunately, MySql has a useful IGNORE keyword which suppresses failed inserts, so any existing postcodes will be skipped (since the postcode is used as the primary key on the table) whilst new ones are inserted. Of course, this assumes that the latitude and longitude of old postcodes don’t change, which I’m hoping is a reasonable assumption. So my new code looks like this.

using System;
using System.IO;
using DotNetCoords;
using LumenWorks.Framework.IO.Csv;
using MySql.Data.MySqlClient;

namespace ImportCodepoint
{
  class Program
  {
    static void Main(string[] args)
    {
      string[] files = Directory.GetFiles(@"C:\Users\Doogal\Downloads\codepo_gb\Code-Point Open\Data");
      foreach (string file in files)
      {
        ReadFile(file);
      }
    }

    private static void ReadFile(string file)
    {
      using (StreamReader reader = new StreamReader(file))
      using (CsvReader csvReader = new CsvReader(reader, false))
      using (MySqlConnection conn = new MySqlConnection(
        "server=server;uid=username;pwd=password;database=database;"))
      {
        conn.Open();
        foreach (string[] data in csvReader)
        {
          string postcode = data[0];
          // some postcodes have spaces, some don't
          if (postcode.IndexOf(' ') < 0)
            postcode = data[0].Substring(0, data[0].Length - 3) + " " + data[0].Substring(data[0].Length - 3);
          // some have two spaces...
          postcode = postcode.Replace("  ", " ");

          double easting = double.Parse(data[10]);
          double northing = double.Parse(data[11]);

          // there are some postcodes with no location
          if ((easting != 0) && (northing != 0))
          {
            // convert easting/northing to lat/long on the WGS84 datum
            OSRef osRef = new OSRef(easting, northing);
            LatLng latLng = osRef.ToLatLng();
            latLng.ToWGS84();

            using (MySqlCommand command = conn.CreateCommand())
            {
              Console.WriteLine(postcode);
              command.CommandTimeout = 60;
              // INSERT IGNORE skips rows whose primary key (the postcode)
              // already exists, so only new postcodes get added
              command.CommandText =
                "INSERT IGNORE INTO Postcodes (Postcode, Latitude, Longitude) " +
                "VALUES (@postcode, @latitude, @longitude)";
              command.Parameters.AddWithValue("@postcode", postcode);
              command.Parameters.AddWithValue("@latitude", latLng.Latitude);
              command.Parameters.AddWithValue("@longitude", latLng.Longitude);
              int count = command.ExecuteNonQuery();
              if (count > 0)
                Console.WriteLine("Added");
            }
          }
        }
      }
    }
  }
}

Tuesday, February 22, 2011

GZipping all content served up by ASP.NET

Update: I now realise this post is kind of pointless; there is a module for compressing dynamic content, unsurprisingly called DynamicCompressionModule… But the approach described below may still be useful for someone somewhere…

I couldn’t find anything that will GZip all the content returned by ASP.NET. There’s a module for compressing static files but nothing for dynamic content. There may be a good reason for this, perhaps the overhead of GZipping content on the fly can kill your server, but since my current project has no static content I thought it would be useful to give it a go. The solution is pretty simple: register the following module in web.config and you’re good to go.

using System;
using System.IO.Compression;
using System.Web;

namespace MyNamespace
{
  public class GzipModule : IHttpModule
  {
    public void Dispose()
    {
      // nothing to clean up
    }

    public void Init(HttpApplication context)
    {
      context.BeginRequest += new EventHandler(context_BeginRequest);
    }

    void context_BeginRequest(object sender, EventArgs e)
    {
      HttpApplication app = (HttpApplication)sender;
      // only compress if the client has said it accepts gzip
      if ((app.Request.Headers["Accept-Encoding"] != null) &&
            (app.Request.Headers["Accept-Encoding"].Contains("gzip")))
      {
        // wrap the response stream so everything written to it is gzipped
        app.Response.Filter = new GZipStream(app.Response.Filter, CompressionMode.Compress);
        app.Response.AppendHeader("Content-Encoding", "gzip");
        // ensure caches store compressed and uncompressed versions separately
        app.Response.Cache.VaryByHeaders["Accept-Encoding"] = true;
      }
    }
  }
}
Registration (under system.webServer, for the IIS 7 integrated pipeline) looks like this
    <modules>
            <add name="GzipModule" type="MyNamespace.GzipModule" />
    </modules>

Saturday, February 05, 2011

Fixing 404 errors when using ASP.NET 4 routing

It took me a while to figure this out. Routing is meant to be baked into ASP.NET 4, but when I tried to set it up, all I got was 404 errors. I did a lot of Googling but couldn’t find anything. It turned out all I was missing was this in web.config

<system.webServer>
    <modules runAllManagedModulesForAllRequests="true"></modules>
</system.webServer>
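For completeness, here’s roughly how a route gets registered in Global.asax with ASP.NET 4 routing; the route name, URL pattern and target page here are made up for illustration.

using System;
using System.Web;
using System.Web.Routing;

public class Global : HttpApplication
{
  void Application_Start(object sender, EventArgs e)
  {
    // map a friendly URL onto a physical Web Forms page
    RouteTable.Routes.MapPageRoute(
      "PostcodeRoute",           // route name (hypothetical)
      "postcodes/{postcode}",    // URL pattern
      "~/PostcodeDetails.aspx"); // the page that handles it
  }
}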

Thursday, February 03, 2011

Northern Ireland postcode data

The OS Code-Point Open dataset is great, except for a few omissions. It doesn’t include data for Northern Ireland, the Isle of Man or the Channel Islands. It turns out that the Northern Irish postcode data can be found here. Unfortunately that data is in ESRI and MapInfo formats, which I’m not sure how to read. Fortunately Jamie Thompson has converted it to CSV, which is a little easier to deal with.

From that CSV file, it’s quite simple to import the data into SQL Server or MySql using code slightly modified from my Code-Point examples (SQL Server here and MySql here). The only thing to note is that the CSV file uses Irish grid references rather than OS grid references.
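The change amounts to swapping the OSRef conversion for its Irish grid equivalent. Here’s a sketch of the relevant lines, assuming DotNetCoords’ IrishRef class mirrors OSRef (variable names as in the Code-Point import code):

// inside the CSV loop, replacing the OSRef conversion:
// Irish grid eastings/northings go through IrishRef instead
IrishRef irishRef = new IrishRef(easting, northing);
LatLng latLng = irishRef.ToLatLng();
// shift from the Irish datum to WGS84 for Google Maps et al
latLng.ToWGS84();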

Now to figure out where to get hold of the Isle of Man and Channel Islands data…

Monday, January 31, 2011

Don’t believe everything that Reflector tells you

Every .NET developer loves Reflector, since it gives us a chance to see inside assemblies we don’t have the source for. And I’ve even seen bloggers showing off code reverse engineered by it as evidence of poor coding practices at some organisation or another (“look, these guys use gotos!”). But though Reflector is a brilliant tool, its reverse engineering skills are not perfect. See this fairly innocent-looking switch statement from some code I’m working on

        switch(type)
        {
          case "gateway":
            SetValue(component, "@type", "decision", true);
            string xml = component.InnerXml;
            xml = xml.Replace("gateway", "decision");
            component.InnerXml = xml;
            break;

          case "deliverable":
          case "dataObject":
            SetValue(component, "@type", "document", true);
            break;

          case "annotation":
            SetValue(component, "@type", "note", true);
            break;
        } 

And this is what Reflector shows from the compiled assembly

        if (CS$4$0001 != null)
        {
            if (!(CS$4$0001 == "gateway"))
            {
                if ((CS$4$0001 == "deliverable") || (CS$4$0001 == "dataObject"))
                {
                    goto Label_00B0;
                }
                if (CS$4$0001 == "annotation")
                {
                    goto Label_00C5;
                }
            }
            else
            {
                this.SetValue(component, "@type", "decision", true);
                string xml = component.InnerXml.Replace("gateway", "decision");
                component.InnerXml = xml;
            }
        }
        goto Label_00DA;
    Label_00B0:
        this.SetValue(component, "@type", "document", true);
        goto Label_00DA;
    Label_00C5:
        this.SetValue(component, "@type", "note", true);
    Label_00DA:;

Which I think proves my point…

Thursday, January 13, 2011

Better debugging of .NET services

For a long while I’ve been debugging a .NET service using the recommended approach. Whilst this works, it’s kind of painful. The steps are something like this

  1. Build the service
  2. Realise the service was already running. Stop it from the Services Control Panel applet.
  3. Build the service again
  4. Start the service from the Services Control Panel applet.
  5. Attach to the process from Visual Studio
  6. Realise the service has already executed the piece of code I wanted to debug.
  7. Goto step 1.

There had to be a better way. And a bit of Googling brought up this approach. But I didn’t really understand how it worked. I guess I’d assumed the error shown in Visual Studio when you try to debug a service was actually coming from Visual Studio, but I now realise the error comes from .NET. So by having a different piece of code run in debug builds, the service is treated like any old application.

My solution is slightly different so I can also test the service starting and stopping. My service implementation has StartService and StopService public methods, which are called from OnStart and OnStop. And my Main method looks like this.

    static void Main()
    {
      #if (!DEBUG)
      // release build: run as a normal Windows service
      ServiceBase[] ServicesToRun;
      ServicesToRun = new ServiceBase[] 
            { 
                new MyService() 
            };
      ServiceBase.Run(ServicesToRun);
      #else
      // debug build: run as a plain console app so the debugger works
      // from the start, exercising the same start/stop logic as the service
      MyService service = new MyService();
      service.StartService();
      service.DoStuff();
      service.StopService();
      #endif
    }
    }