Archive for 2009

Internet cold turkey

Thursday, April 16th, 2009

I’ve just come back from five weeks holidaying in South America. I left with the intention of going totally internet cold turkey for that time. I wasn’t going to actively seek out the hotel computers or an internet cafe to check my email, update Twitter or Facebook, or even upload any holiday photos to Flickr. I wanted to spend my time enjoying where I was going and leave the world behind for a bit.

That really didn’t last long. Unlike in Australia, everywhere I went was bathed in free wi-fi networks. Most of the hotels I stayed in had free wi-fi; if it wasn’t in the rooms, it was in the lobby and dining areas. Cafes, ice cream stores, bookshops and even many town squares had free wi-fi. This was common throughout big cities like Buenos Aires and Lima, and not quite as common, but still present, in smaller towns like Cusco, Puno, Potosi and Sucre (in Peru and Bolivia). It was these smaller towns that really surprised me: they may not have had paved roads, but they had good internet connections.

Free wi-fi / Wi-fi zone

Despite being connected nearly the whole time I was away, I purposefully avoided checking things too much. I took an iPod touch with me, which was the perfect travelling companion. When I felt like it I could check email and update Twitter, and that was about all I did. The news at home was irrelevant; I was in some magnificent countryside and I wanted to experience that, not a computer screen.

While I was away, the world kept ticking along; I was just oblivious to most of it, and it really didn’t matter. I didn’t miss anything too urgent, and I felt healthier by the end of it.

A few important things did happen while I was away. The Powerhouse Museum nominated my Then & Now mashup for a “Best of web” award at the Museums and the Web conference – thank you. I was also asked by the State Library of New South Wales to present at their reference@the metcalfe seminar for NSW public libraries in May. 

I’m now firmly back into my regular routine of being connected with the world as much as possible. Is it better? It’s different and I now find I can switch off a little easier than I could a couple of months ago.

Backing up the cloud

Sunday, February 1st, 2009

Yesterday Ma.gnolia had a total failure, and everyone’s accounts and the data they had added became inaccessible from both the standard web interface and the API. In December, Pownce closed down, and in January Google announced it was closing down Google Notebook, Dodgeball and Jaiku. Users have also become used to the regular appearance of Twitter’s fail whale. With these closures and interruptions to services, can we still rely on storing vital data “in the cloud”?


Stephen Collins has the right idea and has been duplicating his bookmarks in both Ma.gnolia and Delicious. That’s something I haven’t been doing, so I’ve been affected. At this point I don’t know what I may have lost; probably not much, as I tend not to use Ma.gnolia much. But it does raise the issue of storing data “in the cloud”. This is now the second time in two months that I’ve been burned by storing what might be considered vital data outside of my immediate control. I recently had an application for a grant that I was working on in Google Docs. I hadn’t enabled Google Gears, and the night before it was due my ISP had issues and I had no internet connection. It’s my own fault for not having a backup, as there were many points of failure that prevented me from accessing the information I needed.

We all know that we should have a regular backup regime for our computers – back up everything (music, documents, email etc.) onto different media and store it in different locations. I think it’s time we add a backup regime for our online storage as well.

  • Gmail, Hotmail, Yahoo Mail etc – Get your desktop email program to sync with the web application and store a copy of your web-based email with your desktop email client. Here are instructions for forwarding your Gmail or Yahoo mail to another email client.
  • WordPress – WordPress can create an XML file of your data that can be used to import into a new instance of WordPress or into another blogging service. Log in to your WordPress blog and go to Tools -> Export.
  • Flickr – As I only upload about 10% of my photos to Flickr, I still have all the originals backed up normally. To backup tags, comments and other data that has been entered directly into Flickr, use a program like FlickrEdit to backup your photos and data.
  • Twitter – Visit tweetake and select what you would like to backup. It will download a CSV file with your information (although you have to trust them with your username & password).
  • Delicious – if you have curl available (it ships with Mac OS X), you can type the following into the terminal and it will save an XML file with all your data into your home directory (change the username and password to match your account details; the URL is the del.icio.us v1 API endpoint).
    curl --user accountname:password -o Delicious.xml 'https://api.del.icio.us/v1/posts/all'
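If you run a backup like the Delicious one regularly, it’s worth keeping dated copies rather than overwriting one file. Here’s a rough Python sketch of that idea – the endpoint is the del.icio.us v1 API, and the account details are placeholders to substitute with your own.

```python
# Sketch: download a Delicious bookmark export to a date-stamped file,
# so each run keeps a separate backup. Credentials below are placeholders.
import urllib.request
from datetime import date

API_URL = "https://api.del.icio.us/v1/posts/all"  # del.icio.us v1 endpoint

def backup_filename(service, day=None):
    """Name each backup by service and date, e.g. delicious-2009-02-01.xml."""
    day = day or date.today()
    return f"{service}-{day.isoformat()}.xml"

def fetch_bookmarks(username, password, url=API_URL):
    """Download the full bookmark export using HTTP basic auth."""
    mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    mgr.add_password(None, url, username, password)
    opener = urllib.request.build_opener(urllib.request.HTTPBasicAuthHandler(mgr))
    with opener.open(url) as resp:
        return resp.read()

# Usage (performs a real network request):
#   data = fetch_bookmarks("accountname", "password")
#   with open(backup_filename("delicious"), "wb") as f:
#       f.write(data)
```

The date-stamped filename means a bad export one day doesn’t clobber the last good one.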

These don’t cover all the services out there, and there may be some better ways of doing things, but I hope these little tips help you to keep some of your data safe while you are using “the cloud”.

New York then and now

Tuesday, January 6th, 2009

I’ve been playing around with yet another Flickr Commons then and now project, this time using the images of New York from 1935–1938 from the New York Public Library. The process for this has been a little different to the previous then and now demonstrations. The images that have been posted don’t have any geo-location metadata (a latitude or longitude), so they can’t be placed directly on a map in the same manner as other Commons photographs. What they do have instead is very good street addresses in their titles.

The Google Maps API has a geocoding call that translates a human-readable address into a latitude and longitude. So if we pass the title of a photo into the API – let’s say “Willow Street, No. 113, Brooklyn” – it returns the latitude and longitude “40.6978614, -73.9955804”.
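As a sketch of that geocoding step, here’s what it looks like against Google’s current JSON geocoding web service – an assumption on my part, since the API I used worked a little differently and the service now requires an API key. The sample response below is canned, matching the shape the service returns.

```python
# Sketch of the geocoding step against Google's JSON geocoding service
# (assumed current API; a key is now required). The sample response is canned.
import urllib.parse

GEOCODE_URL = "https://maps.googleapis.com/maps/api/geocode/json"

def geocode_url(address, api_key):
    """Build the request URL for a human-readable address."""
    query = urllib.parse.urlencode({"address": address, "key": api_key})
    return f"{GEOCODE_URL}?{query}"

def parse_geocode(response):
    """Pull (lat, lng) out of a decoded geocoding response, or None."""
    if response.get("status") != "OK" or not response.get("results"):
        return None
    loc = response["results"][0]["geometry"]["location"]
    return loc["lat"], loc["lng"]

# Canned response in the shape the service returns for the example title:
sample = {
    "status": "OK",
    "results": [{"geometry": {"location": {"lat": 40.6978614, "lng": -73.9955804}}}],
}
print(parse_geocode(sample))  # (40.6978614, -73.9955804)
```

The `None` return for non-OK statuses is what lets ambiguous titles fall through to manual handling later.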

For the demonstration I’m using a KML file. Generating this file is now a two-step process: import the data from Flickr using their API, then pass each photo’s title into the Google Maps API to get the latitude and longitude and merge both results into a KML file.
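The merge at the end amounts to writing one KML Placemark per photo. A minimal sketch, using the example address above – the photo URL and field names are illustrative, not the real records:

```python
# Minimal KML output for the merged Flickr + geocoding data.
# The sample record below is illustrative, not the real dataset.
from xml.sax.saxutils import escape

def placemark(title, lat, lng, photo_url):
    """One KML Placemark. Note KML coordinates are longitude,latitude."""
    return (
        "<Placemark>"
        f"<name>{escape(title)}</name>"
        f"<description>{escape(photo_url)}</description>"
        f"<Point><coordinates>{lng},{lat}</coordinates></Point>"
        "</Placemark>"
    )

def kml_document(placemarks):
    """Wrap the placemarks in a complete KML file."""
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
        + "".join(placemarks)
        + "</Document></kml>"
    )

doc = kml_document([
    placemark("Willow Street, No. 113, Brooklyn",
              40.6978614, -73.9955804,
              "http://www.flickr.com/photos/nypl/example"),
])
```

The longitude-first ordering inside `<coordinates>` is the classic KML gotcha when moving data over from a lat/lng API.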

Of course, some of the titles provide ambiguous addresses or don’t provide enough information, and don’t automatically return a result. For some of the images I’ve manually tweaked the data that I pass into the geocoding API to obtain a result. The results are by no means perfect, but it’s a pretty good demonstration of what can be achieved from very little data by automating everything.

Please explore my New York then and now mashup and let me know what you think.
