05 AUGUST 2010
by JAMES SILVER
24 hours into his deployment in Haiti, as part of the Red Cross’s first rotation after January’s 7.0-magnitude earthquake flattened much of Port-au-Prince, Kjeld Jensen was busy making maps.
His unit had arrived by bus from the Dominican Republic the day before and the driver had promptly lost his way. “We lost at least three-quarters of an hour, I’d say,” says Jensen, who was heading the joint Danish-American IT & Telecom Emergency Response Unit. “And with eight to ten people on the bus, that adds up to quite a few lost [person] hours.”
Of the logistical headaches of disaster relief work, mapping is the most urgent. If they’re lucky, Red Cross responders are handed basic maps printed out from the internet before they go, says Jensen, who is now back in Norway. But on the ground, they have to work closely with local teams, MapAction, which specialises in drawing up maps for disaster areas, and the UN. They use Google Earth to plot the locations of camps for internally displaced persons [IDPs], logistics warehouses and radio repeaters. “You can hack a KML file [which allows users to customise Google maps] in minutes and share it with others,” he says.
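A KML overlay of the kind Jensen describes hand-editing can also be generated programmatically in a few lines. This is a minimal sketch in Python; the waypoint names and coordinates are invented for illustration, not actual camp locations:

```python
# Minimal sketch: building a shareable KML overlay of waypoints,
# the kind of file Jensen describes hacking together in minutes.
# Names and coordinates below are illustrative, not real locations.
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"

def waypoints_to_kml(waypoints):
    """waypoints: list of (name, longitude, latitude) tuples."""
    ET.register_namespace("", KML_NS)
    kml = ET.Element(f"{{{KML_NS}}}kml")
    doc = ET.SubElement(kml, f"{{{KML_NS}}}Document")
    for name, lon, lat in waypoints:
        pm = ET.SubElement(doc, f"{{{KML_NS}}}Placemark")
        ET.SubElement(pm, f"{{{KML_NS}}}name").text = name
        point = ET.SubElement(pm, f"{{{KML_NS}}}Point")
        # KML orders coordinates as longitude,latitude
        ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = f"{lon},{lat}"
    return ET.tostring(kml, encoding="unicode")

kml_text = waypoints_to_kml([
    ("IDP camp (example)", -72.34, 18.54),
    ("Logistics warehouse (example)", -72.29, 18.55),
])
print(kml_text)
```

The resulting file can be opened directly in Google Earth and passed around by email or USB stick, which is what makes the format so useful in the field.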
As unit leader, Jensen quickly built up his own database of GPS waypoints for camps, water pumps, logistics hubs and the airport. But shortly after his arrival, word reached him about OpenStreetMap (OSM), an open-source, worldwide editable mapping project, and this was to transform his mapmaking efforts. In the immediate aftermath of the earthquake, a small army of OSM volunteer mappers across the world had set to work using high-res satellite imagery to map Port-au-Prince and the devastated city of Carrefour. Within 48 hours both cities had been comprehensively mapped. News of OSM’s feat spread quickly among crisis responders.
“The link to the maps was circulating just a few days after we arrived,” says Jensen. “It was new to us, but I installed their maps on the Garmin [GPS] units, which supported it.” When the Red Cross switched the location of its main base, Jensen and his driver got lost trying to find the new HQ. “I had OSM on my Garmin Oregon. I radioed to a colleague who knew the location and we easily found it on the map.” He adds: “[OSM] was a big time-saver for me several times. It made a difference in this operation.”
Many of OSM’s volunteer mappers see themselves as part of a worldwide “open geo-data” movement. Maps have traditionally been restricted by copyright and are often expensive to acquire. OSM, however, was conceived as “Wikipedia meets maps”, aiming “to map the world and give the data away for free”. The “Wikification” of mapping appeals to 30-year-old, London-based web developer Harry Wood, who’s been mapping for OSM since 2006. “To me, it’s about releasing the data, making sure the underlying data is free,” he says. “So many exciting technological things can happen when geo-data is released.” He was one of “about 300” volunteer OSM “armchair mappers” who mobilised in the hours after the earthquake and set about mapping the disaster zone, street by street.
When news first broke, Wood had yet to grasp the scale of the catastrophe. He began by “doodling in” streets visible in Yahoo imagery, which was made available to OSM in December 2006. When the true picture began to emerge, work by OSM mappers began in earnest. Geo-data was taken from a variety of sources ranging from Yahoo to old CIA maps — the access to much of it arranged through the Crisis Mappers network of open-data mappers. The response was co-ordinated through online channels, including an OSM Haiti WikiProject page.
“Quite a few people in the community, me among them, had a feeling that OSM had a lot of potential in a disaster-recovery situation,” Wood explains. “When the earthquake struck we looked at what could be traced in Haiti from Yahoo. Yahoo had reasonably good coverage and no one had used it much. But 24 hours after the earthquake, DigitalGlobe and GeoEye [satellite-imaging services] also released aerial imagery at a much higher resolution than Yahoo’s. That made a huge difference to us.”
OSM’s efforts in Haiti, and beyond, continue. Mikel Maron, instigator of OSM’s response to the earthquake and chairman of its data working group, drew up the Humanitarian OSM Team’s strategy for Haiti. Since the initial flurry of post-quake mapping activity, the organisation has three times sent teams to the country to increase the project’s visibility among NGOs, run training workshops and improve the quality of the data being gathered.
OSM mappers were called into action again when an 8.8-magnitude earthquake off the coast of Chile took place just weeks after the Haiti disaster. This time, however, access to high-quality satellite imagery was heavily restricted, hampering their effort. “OSM already has an awesome global network, that’s key,” says Maron. “But satellite imagery is so important. In Chile, the channels to receive imagery free of licensing restrictions were not as open, so we’re looking for partners in that, especially at the UN. We want to be ready for the next crisis.”
OpenStreetMap was launched on August 9, 2004, by University College London “dropout” Steve Coast, who began what he calls his “bedroom project” after growing frustrated at the lack of royalty-free maps in the UK. “I had a GPS unit and a laptop with Linux, and there were various bits of open software to talk to the GPS, to plot your position, but there was no data,” says the 29-year-old, who is now working on various Linux and open-data projects from Denver, Colorado. “So what I ended up doing was downloading copyrighted map pictures from sources like Microsoft MapPoint. They were pretty useless, because they were images not data, so I figured it would be easy to make my own map of central London. And if I could make the software that could do that for me, then I could just open it up for everyone else and we could make a map of the world, jigsaw-puzzle-like.”
Today, the non-profit OpenStreetMap Foundation claims over 260,000 members in 30 countries. It has also spawned a commercial sibling, CloudMade, which was launched in 2007 by Coast and business partner Nick Black “to provide services on top of OSM”. Expanding the open-map project remains at the heart of the foundation’s purpose, but the community has also launched projects such as OpenCycleMap.org, which lists national and regional cycle routes across the world, and is producing iPhone apps such as Atm@UK, which locates your nearest cashpoint.
Less than three weeks after taking office, David Cameron sent a letter to all government departments demanding that they “open up” data. “Greater transparency across government,” he wrote, “is at the heart of our shared commitment to enable the public to hold politicians and public bodies to account; to reduce the deficit… and to realise significant economic benefits by enabling businesses and non-profit organisations to build innovative applications and websites using public data.”
Among the commitments the prime minister made was the promise to publish online: all “new items of central government spending over £25,000” from November this year; “full information on all DFID international-development projects over £500” from January 2011; and, within the same timescale, crime data “at a level that allows the public to see what is happening on their streets”.
Street-level searchable data maps and charts are increasingly popular in the US. City-data.com, which claimed 15 million unique visitors in March 2010, allows users to search cities with 6,000 residents or more for data ranging from population density and racial make-up to the number of registered sex offenders and building permits issued for “single-family new houses”.
Meanwhile, Oakland Crimespotting, a crime map of Oakland, California, updated daily from police reports, has shown how data can be useful at a neighbourhood level. As well as providing an online map, the site allows residents to sign up for email alerts and RSS feeds, and functions as a browsable crime database.
In the UK, the process of letting daylight seep into previously guarded public data began under Gordon Brown, who, in 2009, asked Tim Berners-Lee, the computer scientist credited with creating the web, what could be done to make better use of the internet. Berners-Lee’s much-quoted response — “Put all your government data on to the web” — set in motion a chain of events that culminated in the launch, in January 2010, of data.gov.uk. With access to a deluge of government-held non-personal data, developers could now create apps, mashups and visualisations. Its far bigger US equivalent, data.gov, launched in May 2009, now offers 272,677 datasets for reuse, with 236 applications already built.
Whether it is searchable crime maps or real-time trip-planner apps, the freeing up of government information has brought about a data revolution, shaking up existing models and changing lives.
At the Institute of Medicine in Washington DC in June, developers unveiled 16 projects which aimed to “harness the power of information to improve health”. Among them was a web portal built by the Network of Care for Healthy Communities in Sonoma County, California, a health-advocacy organisation. By aggregating government data into a community dashboard, the site helps decision-makers design effective and trackable health policy.
This data-led, web-inspired movement — branded Health 2.0 or open medicine — is boosting patient-power too. Network of Care’s portal lets patients choose practitioners. PatientsLikeMe is a free data-sharing social-networking platform on which 70,000 registered users share information and “real world, outcome based patient data”, on conditions from MS to HIV and depression.
“We’re giving patients the power to collect information to help them manage their illness,” says PatientsLikeMe co-founder and chairman Jamie Heywood. “We build a data-framework model of a disease, and let patients fill it in.” Data gathered on the site is put to valuable use, he says. “We’ve done clinical-trial recruiting and collaborations with academia, and we sell pharmaceutical companies information and services.”
The deaths of 400 patients at Stafford Hospital, 50km north-west of Birmingham, between 2005 and 2008 are now the subject of a public inquiry. But the revelation that patients had died unnecessarily after undergoing treatment at the hospital might not have been picked up at all were it not for data gathered by medical-intelligence company Dr Foster. Using NHS-derived datasets, it found that the hospital had overly high mortality rates, which prompted a full investigation.
The tale of a company which set out to gather, analyse and repackage government data — and now provides intelligence back to the NHS — is an inspirational one to Christopher Osborne, business-development director at transport-technology company ITO, which provided the visuals on these pages.
“[What Dr Foster] did is very similar to our mission to provide transport intelligence to government,” he says. But ITO — which offers “web-based services… using state-of-the-art visual-effects techniques” — is up against a formidable obstacle: namely, the fact that the open-data revolution has yet to impact significantly upon the UK’s public-private transport system.
“It’s a sorry state of affairs,” sighs Osborne. “There’s so little transport data out there. And there hasn’t been much discussion about it. This country is currently trying to make huge transport decisions — High Speed Rail 2 involves a minimum of £20 billion — but we’re doing it without data about how people actually use transport. We just make educated guesses at it.”
To inform such decisions, ITO has been working on Ideas in Transit, a five-year research project. “We’re trying to step back from large-scale, top-down transport projects to ask how we apply user innovation to transport. Very quickly it emerged that there’s a thriving developer community out there for transport, but it’s totally reliant on access to data.”
Osborne contrasts the situation with the US, where dozens of public-transport-related apps have sprung up, using data from the 113 transit authorities whose “open” policies allow developers to reuse their information. “We don’t have that ecosystem yet in the UK because most of the data is still locked away,” he says. The situation in Britain is changing, however. In mid-June, Transport for London (TfL), the body which runs most of London’s public-transport system, lifted its restrictions on the commercial reuse of its data by software developers, and released several new datasets, including live travel-news and departure-board feeds. By July 1, one of these, the London Underground departure-board feed, which gives live train information, had had to be withdrawn after receiving more than ten million hits per week. As Wired went to press, TfL was unable to say when the feed might be reopened.
That aside, Jonathan Raper, professor of geographic information science at City University and advisory-committee member of the London Datastore, which supplies data sets from the Greater London Authority, says future releases will be even more meaningful and could include anonymised travel-pattern data derived from RFID-enabled Oyster cards. “That’s absolutely on our agenda. It’s accepted by TfL that Oyster data will be released.” (TfL says there are currently no plans to release Oyster-related data.)
Revealing real-time usage data from transport networks would certainly help in determining, say, when and where to run more trains. Traffic-count datasets released by data.gov.uk, gathered by people counting vehicle flow with clickers for one day a year, enabled ITO to create a UK traffic “heat map” of the years between 2001 and 2008. “In that time, the number of cars increased across the UK,” explains Osborne. “But if you look at the traffic count between those years there’s little change — apart from in London.”
In the capital, the visualisation — a cluster of multicoloured dots — thins out over the seven-year period (see previous spread). “The data shows that the most successful scheme in UK transport in the past ten years was London’s Congestion Charge. When you go through the data and do some clear visualisations, you can see that there’s been a huge decrease in the cars-and-taxis category right across London.”
From better-informed government decision-making to software developers building apps, the digital age is rewiring daily life. With raw public data freeing up, Berners-Lee, the man who kick-started the UK government’s open-data revolution, hopes the momentum will now be “self-driving”. Next he plans to push to free up other datasets. “What should it be in 2010?” he asked earlier this year. “Putting government data on the web has been a very exciting journey. We have to keep pushing, though.”
OPEN-DATA MASHUPS IN ACTION
When an Icelandic volcano which had been dormant since 1823 erupted on April 14, 2010, sending a plume of ash into the atmosphere, swathes of European airspace were closed for seven days. At the height of the crisis — on April 18 — there were just 5,204 flights in Europe, compared with 24,965 on the same day a week earlier.
Overnight, Eyjafjallajökull’s angry ejections transformed flight-data-and-map mashup Flightradar24.com (FR24), which tracks live air traffic, from geek-zone online curiosity to the go-to site for stranded passengers searching for data with which to plan their routes home.
FR24’s visualisation (below) — clickable yellow planes, with coloured trails signifying altitude — brought the crisis to life. “Before the ash cloud we used to have between 50,000 and 70,000 visitors a day,” says Mikael Robertsson, the site’s Stockholm-based creator. “[During the flight ban] we were suddenly attracting one million a day.” FR24 works by tracking the progress of aircraft fitted with ADS-B transponders (about 60 per cent of passenger aircraft, including all models of Airbus planes, Boeing 737-787 series and some series of Fokker, Gulfstream and McDonnell Douglas jets), via a network of people with ADS-B receivers around Europe. “[People who want to join the network] send us an email and we send them a small script/software that they run on their local computer,” explains Robertsson.
From IP addresses in FR24’s log-files, Robertsson knows that many of the site’s visitors are from airlines, including SAS, Lufthansa, Norwegian, Finnair, airBaltic, Air France and BA. “We also know that airport employees are using FR24 to know when planes will arrive,” he says. A small Google AdSense banner covers “almost 100 per cent” of the running costs. JS
As a frontline scientist with the Atlanta-based US Centers for Disease Control (CDC), David Van Sickle was part of a team of epidemiologists called in to investigate unusual outbreaks of disease. But there was one disease Van Sickle never got to investigate during his time with the agency — asthma. “That was pretty startling,” he says, “when you think it’s one of the most common chronic diseases.”
When he left the CDC, Van Sickle began asking what had prevented the team from investigating asthma. He came to the conclusion that the omission was due to the poor quality of public-health surveillance data. “The data we had was focused mainly on hospitalisations and death, which we often got years after those had occurred, so it was neither timely nor geographically specific.” In response, he developed Asthmapolis — a data-gathering and disease-mapping project.
There are two main strands to Asthmapolis. The first is a project that tracks the conditions in which asthma patients develop symptoms, by attaching a device known as a Spiro Scout to their inhalers. This uses GPS to ascertain where and when an inhaler is used and sends this data to a remote server. A second project uses web-enabled mobile phones to map and track symptoms, triggers and the use of inhalers and other medications in a digital diary. The Asthmapolis website then summarises patterns of use and trends over time, with maps, tables and charts.
“We then try to identify what we can — based on other things we collect such as activity patterns, data from individuals about where they work, their job, where they live.” Van Sickle says the studies have been revelatory about when and where people have asthma symptoms. “Whereas public health has often focused on exposures within the home — dust mites and pet dander [flakes of skin] — we discovered that patients were more often using their inhalers in other locations such as at school or in town,” he says. “We’ve also seen interesting data patterns in the time of day when asthma occurs.” JS
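The kind of summary the Asthmapolis site produces can be sketched as a simple aggregation over timestamped, geotagged inhaler events. Everything below, the event times and the location labels resolved from GPS fixes, is invented for illustration:

```python
# Sketch of the aggregation behind an Asthmapolis-style usage summary:
# bucket timestamped, geotagged inhaler events by location category and
# by hour of day. All events below are invented for illustration.
from collections import Counter
from datetime import datetime

events = [
    # (timestamp, location label resolved from a GPS fix; hypothetical)
    ("2010-05-03T08:15", "school"),
    ("2010-05-03T08:40", "school"),
    ("2010-05-03T19:05", "home"),
    ("2010-05-04T08:22", "school"),
    ("2010-05-04T13:10", "town"),
]

by_location = Counter(loc for _, loc in events)
by_hour = Counter(datetime.fromisoformat(ts).hour for ts, _ in events)

print(by_location.most_common(1))  # [('school', 3)]
print(by_hour.most_common(1))      # [(8, 3)]
```

Patterns like these, inhaler use clustering at school rather than at home, and peaking in the morning, are exactly the sort of finding Van Sickle describes.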
Real-time bus updates; personalised “walkability mapping”; real-estate searches by transit stop — these are just three of the 132 mobile apps on the US site City-Go-Round.
Launched last December, the site is a pocket revolution of local transport apps which use open data from the 113 transit agencies that have released their information to date. “We think that great software is one of the easiest ways to make public transit better to ride and more efficient,” says Matt Lerner, chief technology officer at Front Seat, a “civic software company”. “In Seattle, where I live, there’s an app called One Bus Away (above) that tells you exactly when the next bus will arrive, so you don’t have to wait at the stop for 20 minutes.”
The transit agencies — including a handful in Canada and Australia — make their data available in General Transit Feed Specification, a format that was created by Google specifically for transit information. “This has become the standard format for transit data and is easy for programmers to work with,” says Lerner, a former lead programme manager for Windows Vista at Microsoft. Data from City-Go-Round is currently viewed by over one million unique visitors per month on its sister site Walk Score — a popular site which calculates the “walkability” of any address. JS
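Part of why GTFS is easy for programmers to work with is that it is nothing more exotic than a zip of plain CSV files. This sketch parses a made-up fragment of a GTFS stops.txt and does the nearest-stop lookup a One Bus Away-style app needs; the stop names and coordinates are invented, not from any real agency's feed:

```python
# Sketch of consuming GTFS data: stops.txt is plain CSV, so a
# nearest-stop lookup takes a few lines. The feed contents below
# are made up for illustration, not from any real transit agency.
import csv
import io
import math

STOPS_TXT = """stop_id,stop_name,stop_lat,stop_lon
S1,Pine St & 3rd Ave,47.6101,-122.3352
S2,Union St & 5th Ave,47.6089,-122.3331
S3,Madison St & 1st Ave,47.6043,-122.3368
"""

def load_stops(text):
    """Parse a GTFS stops.txt body into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def nearest_stop(stops, lat, lon):
    """Pick the closest stop by equirectangular approximation,
    which is fine at the city scale these apps operate at."""
    def dist(stop):
        dlat = float(stop["stop_lat"]) - lat
        dlon = (float(stop["stop_lon"]) - lon) * math.cos(math.radians(lat))
        return math.hypot(dlat, dlon)
    return min(stops, key=dist)

stops = load_stops(STOPS_TXT)
print(nearest_stop(stops, 47.610, -122.335)["stop_name"])  # Pine St & 3rd Ave
```

A real app would read the same columns from an agency's published feed and pair them with the arrival-time files (stop_times.txt) in the same zip.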