
MPs’ Web Sites

When I set up Planet Westminster in 2006 I thought it would be a relatively simple project to maintain. Over the years, more and more MPs would start blogs. Every couple of months I’d add the new ones and everything would be great.

It hasn’t worked out like that at all. MPs’ web sites have proved to be really difficult to keep track of.

The problem is, of course, that the vast majority of MPs have absolutely no idea how web sites, blogs or web feeds work. That’s to be expected. What’s less expected is that many of them seem to get round that problem by delegating the work to people who also have no idea how web sites, blogs or web feeds work.

I’ve just done a clean-up of the feeds I’m currently monitoring. Here are some of the problems I’ve dealt with.

A few MPs (including Douglas Carswell and Caroline Lucas) changed the address of their web feed. Just changed it. No notification as far as I can see. No attempt to redirect the old address to the new one. Just an old address returning a 404 error. Anyone who was subscribed to the old address would have just stopped getting updates. It’s almost like they don’t want people to follow what they have to say.
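The fix is hardly rocket science: the old feed address just needs to return a permanent redirect to the new one. As a rough sketch (the new URL here is made up, and most hosting set-ups could do the same with a one-line server configuration change), a tiny Perl CGI script sitting at the old address would do it:

    #!/usr/bin/perl
    # Sketch only: point subscribers of the old feed address at the new one
    # with a permanent (301) redirect so feed readers pick up the move.
    use strict;
    use warnings;
    use CGI;

    my $q = CGI->new;
    print $q->redirect(
        -uri    => 'http://www.example-mp.org.uk/feed/',   # made-up new address
        -status => '301 Moved Permanently',
    );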

Ed Miliband’s web site has just ceased to exist. It now redirects you to the main Labour Party web site. Because the leader of the party obviously has no constituency responsibilities. Or something like that.

John McDonnell seems very confused. In 2007 he had a web site at john4leader.org.uk. In 2010, he was at john-mcdonnell.net. Both of these sites are now dead and he’s at john-mcdonnell.net. It’s like no-one has told him that you can reuse web site addresses. I wonder what he’ll do once he’s run out of variations of his name on different top-level domains.

Eric Joyce has just lost control of his domain. His ericjoyce.co.uk address currently goes to an unfinished web site campaigning for “John Smith for State Senator”. It doesn’t look as though Joyce realises this as he’s still promoting the web site on his Twitter profile.

Then there’s Rory Stewart. His web feed was returning data that my RSS parser couldn’t parse. Taking a closer look, it turned out that it was an HTML page rather than RSS or Atom. And it was an HTML page that advertised an online Canadian pharmacy pushing Cialis. Not really what an MP should be promoting.

Stuff like this happens all the time. MPs need to take more notice of it. And they need help from people who know what they are talking about. My theory (and it’s one that I’ve written about before) is that MPs’ web sites and blogs are often overcomplicated because they are built by companies from a corporate IT background who dismiss the possibility of using something free like WordPress and instead over-engineer something using the tools they are comfortable with. It can’t be a coincidence that many of the worst MP web sites I’ve seen serve pages with a .aspx extension (sorry – only geeks will understand that).

I’m going to repeat an offer I’ve made before. If any MP wants a blog set up for them, then I’m happy to help them or to put them in touch with someone who can. It needn’t be expensive. It needn’t be complex. But it can be very effective. And it will work.

Update: Eric Joyce replied to me on Twitter. He said:

Thanks. It’s being worked on and they seem to have pointed it at an obvious specimen page.


More Planets

Over the weekend I found time to rebuild the rest of my missing planets. I’ve resurrected Planet Balham, Planet Westminster and Planet Doctor Who, and they all have Atom feeds available as well.

This has been an interesting test of Perlanet (my simple planet-building program). When building planet davorg, I was only using feeds that I had some kind of control over. It was therefore pretty simple to ensure that the web page created was valid HTML (though, due to some bugs in the Perl modules I’m using, the same can’t be said of the Atom feed). But with these new planets, I’m aggregating feeds from all sorts of places and am seeing problems that I hadn’t seen before. In particular I’ve changed Perlanet to deal with the cases where the feed can’t be downloaded for some reason (I think that some of the MPs on my list have stopped blogging) and where the feed isn’t valid.
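To give a flavour of what that involves, here’s a stripped-down sketch (not the actual Perlanet code) of fetching and parsing a feed defensively with LWP::UserAgent and XML::Feed, so that one dead or broken feed can be skipped rather than killing the whole run:

    # Sketch only: fetch a feed, but return nothing (with a warning)
    # if it can't be downloaded or doesn't parse as RSS or Atom.
    use strict;
    use warnings;
    use LWP::UserAgent;
    use XML::Feed;

    my $ua = LWP::UserAgent->new( timeout => 20 );

    sub get_feed {
        my ($url) = @_;

        my $resp = $ua->get($url);
        unless ( $resp->is_success ) {
            warn "Can't fetch $url: ", $resp->status_line, "\n";
            return;
        }

        my $content = $resp->decoded_content;
        my $feed    = XML::Feed->parse( \$content );
        unless ($feed) {
            warn "Can't parse $url: ", XML::Feed->errstr, "\n";
            return;
        }

        return $feed;
    }

Perlanet does rather more than this, but the principle is the same: log the problem, skip the feed and carry on with the rest.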

There are also plenty of examples of feeds that have some pretty mad HTML in them, which is breaking the layout of the output pages. On Planet Balham there seems to be some broken HTML that is badly affecting the <div>s on the page, moving the Google AdSense block halfway down the page. Also, the second half of the page is currently in italics due, I suspect, to an unclosed <i> tag. On Planet Westminster there’s also some kind of problem which means that the names of the feeds change size halfway down the page.

So it’s clear that I need to add something to clean up the feeds. I’ll probably look at using HTML::Tidy or HTML::Scrubber (perhaps both). Expect some better-looking pages in the next few days.
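As a first pass, something along these lines with HTML::Scrubber would probably go a long way (treat it as a sketch rather than finished code):

    # Sketch only: strip each entry's HTML down to a small set of harmless
    # tags and attributes before it goes anywhere near the output page.
    use strict;
    use warnings;
    use HTML::Scrubber;

    my $scrubber = HTML::Scrubber->new(
        allow => [qw( p br a em strong i b ul ol li blockquote pre code img )],
    );
    $scrubber->rules(
        a   => { href => 1, title => 1 },
        img => { src  => 1, alt   => 1 },
    );

    sub clean_entry {
        my ($html) = @_;
        return $scrubber->scrub($html);
    }

That deals with dodgy tags and attributes, but it won’t close that stray <i> for me, which is where HTML::Tidy should come in.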