In the aftermath of Hurricane Katrina (Monday, August 29, 2005), grass-roots efforts to help scattered family members find each other might seem to have nothing to do with shopbots. But it turned out that the process those volunteers followed, setting up websites and sifting through web sources to find and aggregate information about people, is exactly what a shopbot does.
Initially, the Red Cross, Craigslist, Yahoo, Google, and others established their own sites for people to search, and all kinds of non-profit organizations, including universities, launched sites to help as well. To increase the chance of finding one another, people posted their search notices on some or all of those sites as well as on well-known public forums. It soon became clear that a central list was needed to make the search process more efficient.
On Friday, September 2, a group of tech-savvy volunteers launched www.katrinalist.net. They used "screen scraping" to capture relevant information from as many websites as they could find and applied their search algorithm to the result. They also authored an open-source data specification for organizing missing-person information, the PeopleFinder Interchange Format. This step resembled the initial stage of shopbot development, when automated shopbots such as BargainFinder were designed and released to automatically search and aggregate product information from the Web.
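To make the scraping step concrete, the sketch below is a minimal, hypothetical example of turning one unstructured posting into a structured record in the spirit of the PeopleFinder Interchange Format. It is not the katrinalist.net code; the HTML layout, field names, and sample content are assumptions for illustration only.

```python
# A minimal sketch of "screen scraping" a missing-person posting.
# The HTML layout and field names below are hypothetical; real pages
# varied widely, which is exactly why manual coding was later needed.
from bs4 import BeautifulSoup

SAMPLE_POST = """
<div class="post">
  <h2>Looking for Jane Doe</h2>
  <p class="location">Last seen: New Orleans, LA 70112</p>
  <p class="contact">Contact: john.doe@example.org</p>
</div>
"""

def scrape_post(html: str) -> dict:
    """Extract a simplified, PFIF-style record (illustrative fields only)."""
    soup = BeautifulSoup(html, "html.parser")
    title = soup.select_one("h2").get_text(strip=True)
    location = soup.select_one("p.location").get_text(strip=True)
    contact = soup.select_one("p.contact").get_text(strip=True)
    return {
        "full_name": title.replace("Looking for ", ""),
        "last_known_location": location.replace("Last seen: ", ""),
        "contact": contact.replace("Contact: ", ""),
        "source_format": "scraped",
    }

if __name__ == "__main__":
    print(scrape_post(SAMPLE_POST))
```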
However, they soon realized that a great deal of information could not be processed automatically because it was not structured at all. So, in the next step on Saturday morning, they recruited more volunteers to manually code the unstructured messages on various sites and sift through all the missing-person posts. To facilitate the process, they built a wiki site to dole out chunks of data to be parsed. By the end of the day thousands were volunteering, and in total some three thousand people are thought to have contributed. Salesforce.com contributed a more robust back-end server when the makeshift database became overloaded.
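As a rough illustration of how unstructured posts might be doled out in chunks for manual coding (the actual wiki workflow is not documented here), one could simply partition the raw messages into fixed-size batches for volunteers to claim:

```python
# Hypothetical sketch: split raw, unstructured posts into fixed-size
# chunks that volunteers can claim and hand-code one batch at a time.
from typing import List

def make_chunks(posts: List[str], chunk_size: int = 50) -> List[List[str]]:
    """Group raw posts into batches of `chunk_size` for manual coding."""
    return [posts[i:i + chunk_size] for i in range(0, len(posts), chunk_size)]

raw_posts = [f"unstructured message #{n}" for n in range(230)]
batches = make_chunks(raw_posts)
print(len(batches), "batches;", len(batches[-1]), "posts in the last one")
# -> 5 batches; 30 posts in the last one
```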
By Monday evening, 50,000 entries had been processed; the number eventually reached 650,000. People could enter a name, a zip code, or an address into a search tool and get an instant list of names matching their query. Over one million such searches were conducted in the immediate aftermath of the hurricane.
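The search tool itself can be pictured as a simple lookup over the structured records. The sketch below is an illustrative in-memory version, not the site's actual implementation, which ran against a real database backend; the records are made up.

```python
# Illustrative sketch of the kind of query the site supported:
# a name, zip code, or address fragment returns matching entries.
records = [
    {"full_name": "Jane Doe", "zip": "70112", "address": "Canal St, New Orleans"},
    {"full_name": "John Roe", "zip": "70433", "address": "Covington, LA"},
]

def search(query: str, entries=records):
    """Case-insensitive substring match over name, zip, and address."""
    q = query.lower()
    return [
        e for e in entries
        if q in e["full_name"].lower() or q in e["zip"] or q in e["address"].lower()
    ]

print(search("70112"))   # matches Jane Doe by zip code
print(search("roe"))     # matches John Roe by name
```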
The next step of the katrinalist.net team's efforts resembled what shopbots now do to collect product data: asking the data providers, in the shopbot case online vendors, to contribute the data directly (data feeds), and combining those feeds with their own screen-scraping. In fact, the majority of the data collected by shopbots now arrives via data feeds.
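For contrast with screen scraping, a data feed arrives already structured, so ingesting it is mostly a matter of mapping fields onto the same record shape used for scraped data. The CSV layout below is a hypothetical vendor feed, not any particular shopbot's format.

```python
# Hypothetical sketch: ingesting a structured data feed (here CSV) is far
# simpler than scraping, because the provider has already done the structuring.
import csv
import io

FEED = """sku,title,price_usd
A100,USB cable,4.99
A200,Wireless mouse,19.95
"""

def load_feed(text: str):
    """Map each feed row to the same record shape used for scraped data."""
    rows = csv.DictReader(io.StringIO(text))
    return [
        {"sku": r["sku"], "title": r["title"], "price": float(r["price_usd"]),
         "source_format": "feed"}
        for r in rows
    ]

print(load_feed(FEED))
```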
Until more than 60% of Web pages possess semantic-like structure, a constant theme will be how to retrieve data efficiently. The heroic efforts of the katrinalist.net team give us a perfect example of how people can collaborate and use existing technologies to meet such a technical challenge.
Reference:
Katrinalist.net and the Peoplefinder Project