Tuesday, October 27, 2009

Perils of online services

The online world, reliant on distributed services, is such a frustrating place. The events of the last few days have totally drained my enthusiasm for “adding value” and “mashing up” all that free content available on the Internet. I feel like I am running in circles, constantly adjusting old applications to keep up with endless changes to browsers and data services, rather than moving forward. A slap of reality, I guess… And don’t get me started on the advice that I should keep my applications “future proof” and stick to “standards”! I try to, but others don’t follow the same rules!

How can you possibly future proof for IE8 not being able to handle screen clicks (and hence requiring yet another IE8-specific hack to make Google Maps work correctly)? How can you write JavaScript code that reads the CSS “background” property consistently in all browsers (as it happens, Firefox, Internet Explorer and Opera handle it totally differently!)? There are millions of small things that collectively make the life of an online application developer pretty miserable… unless someone else is paying handsomely for her/his time!
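To show what I mean with the CSS “background” issue, here is the kind of branching a simple style lookup ends up needing. This is just a sketch of the general pattern, not code from my site: standards browsers expose getComputedStyle, older Internet Explorer only offers element.currentStyle, and even then each browser reports the value in its own format, so reading a longhand property like backgroundColor is safer than the “background” shorthand.

```javascript
// A sketch of a cross-browser style lookup circa 2009 (illustrative only).
// Reads the effective background colour of an element.
function getBackgroundColor(element) {
  if (window.getComputedStyle) {
    // Firefox, Opera, Safari, Chrome – usually returns "rgb(r, g, b)"
    return window.getComputedStyle(element, null).backgroundColor;
  } else if (element.currentStyle) {
    // Internet Explorer – often returns a hex value or a colour name
    return element.currentStyle.backgroundColor;
  }
  return null;
}
```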

Not even the largest Internet websites do it all by the book. Just open your JavaScript console and see how many problems are reported by your browser when you visit popular pages. So, I am not the only one struggling to make it all work, somehow and despite everything…
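On my own pages, a crude way to quantify this is to hook window.onerror before anything else loads and count what would otherwise only show up in the console. A rough sketch below; a real setup would probably post the tally back to the server rather than just log it.

```javascript
// Rough sketch: count script errors that would otherwise only appear
// in the browser's error console.
var scriptErrorCount = 0;
window.onerror = function (message, url, line) {
  scriptErrorCount++;
  if (window.console && window.console.log) {
    window.console.log('Error #' + scriptErrorCount + ': ' + message +
                       ' (' + url + ', line ' + line + ')');
  }
  return false; // let the browser handle the error as usual
};
```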

However, the biggest cause of frustration for me is data and service providers. My entire website is built with third party services and content. That is what I set out to do from the start – a big experiment to determine whether it is a viable model for online existence. There are many sources of free information available for mashing up, but there is a big cost in continuous maintenance to keep up with the latest changes.

Just this Monday I found out that all my weather information pages were not working because the Bureau of Meteorology services had stopped responding! I am a legitimate subscriber, not a site scraper (although that second option has crossed my mind many times, as scraping would be more reliable!). I am yet to hear about the causes and whether it is just a temporary disruption or else… And to add to my frustration, YouTube also happened to update their playlist URLs, which means I will have to put more work into my Online Video Player application (the old URLs still work but users cannot play any newly created playlists). Then there are several GeoRSS feeds I use in my Hazard Monitor that also changed format, so that information is no longer showing up on my pages…
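To give a flavour of what a “format change” means in practice: a GeoRSS feed can carry its coordinates as a single georss:point element (the “Simple” encoding) or as separate geo:lat and geo:long elements (the older W3C Geo encoding), and a feed that quietly switches from one to the other breaks any parser that expects only the first. A defensive reader ends up trying both – a rough sketch below, assuming the feed has already been fetched and parsed into a DOM document:

```javascript
// Rough sketch: pull a lat/lon pair out of a feed <item>, trying the
// GeoRSS Simple encoding first, then falling back to W3C Geo.
// 'item' is assumed to be a DOM element from an already-parsed RSS document.
function readItemCoordinates(item) {
  // GeoRSS Simple: <georss:point>-37.81 144.96</georss:point>
  var point = item.getElementsByTagName('georss:point')[0] ||
              item.getElementsByTagName('point')[0];
  if (point && point.firstChild) {
    var parts = point.firstChild.nodeValue.split(/\s+/);
    return { lat: parseFloat(parts[0]), lon: parseFloat(parts[1]) };
  }
  // W3C Geo: <geo:lat>-37.81</geo:lat> <geo:long>144.96</geo:long>
  var lat = item.getElementsByTagName('geo:lat')[0];
  var lon = item.getElementsByTagName('geo:long')[0];
  if (lat && lon) {
    return { lat: parseFloat(lat.firstChild.nodeValue),
             lon: parseFloat(lon.firstChild.nodeValue) };
  }
  return null; // the format has changed again…
}
```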

I will risk the statement that the examples quoted above are proof that interoperability on a global scale does not work! That is, as long as suppliers only have their own interests in mind (i.e. the “caveat emptor” principle: take my feeds/services, but we will keep changing them as we see fit) and as long as information and services are supplied on an “as is” basis, there cannot be any viable online interoperable environments. There is simply no guarantee that the services will be there when you need them (as I experienced during the Victorian bushfires in February 2009, when fire hotspot data services were unable to cope with the demand. And who says that Yahoo, Microsoft or Google cannot stop serving maps, email etc.? Too big to fail?). Yes, it all seems to work OK most of the time, but only thanks to myriads of hacks, billions of dollars and uncounted hours spent on maintenance. We have learnt to live with less than optimal arrangements, but that doesn’t mean it is proof that “all is fine”…

Let me quote another example. The OGC web map standards were developed in the early 2000s, but we are yet to see globally consistent deployments in a fully interoperable fashion – with proper and interoperable data discovery portals (and metadata!), service delivery undertakings from suppliers, and authoritative and comprehensive information sources. Don’t get me wrong: the US Government is doing it, the EU is doing it, and in Australia the SLIP and AuScope projects are good examples of where OGC standards have been put to good use. But these only work because they are implemented in tightly controlled environments (i.e. end-to-end implementations, from access to source data through to cataloguing and dissemination)! These are not collections of random service nodes but only nodes that comply with that particular environment’s standards. My recent post summarising Ed Parsons’ thoughts on Spatial Data Infrastructure has more on the issue if you care to read on.
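For readers unfamiliar with the standard, this is roughly what the interoperability promise looks like at the request level: any compliant WMS server should answer a GetMap request built from the same standard parameters. The server address and layer name below are made up for illustration; the parameter names follow WMS 1.1.1.

```javascript
// A hypothetical WMS 1.1.1 GetMap request, assembled from standard parameters.
// The server address and layer name are invented for illustration only.
var wmsParams = {
  SERVICE: 'WMS',
  VERSION: '1.1.1',
  REQUEST: 'GetMap',
  LAYERS: 'fire_hotspots',          // layer name as advertised by GetCapabilities
  SRS: 'EPSG:4326',                 // geographic lat/lon coordinates
  BBOX: '112.0,-44.0,154.0,-10.0',  // minx,miny,maxx,maxy – roughly Australia
  WIDTH: '512',
  HEIGHT: '400',
  FORMAT: 'image/png'
};

var pairs = [];
for (var key in wmsParams) {
  pairs.push(key + '=' + encodeURIComponent(wmsParams[key]));
}
var getMapUrl = 'http://example.com/wms?' + pairs.join('&');
// In theory any compliant server answers this; in practice every deployment
// still has its own quirks to cater for.
```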

Anyway, enough grievances for one day. Back to hacking my way through the problems!