eParticipation and webantennE
Conventional eParticipation initiatives involve "the use of information and communication technologies to broaden and deepen political participation by enabling citizens to connect with one another and with their elected representatives" (Wikipedia). Ideally, these will "help foster communication and interaction between politicians and government bodies on the one side, and citizens on the other. Internet, mobile phones and interactive television can be used to channel information to citizens and canvass their views" (Europe's Information Society).
Initial webantennE proposals focused on identifying particular source sets that represent on- and offline localities as well as issue areas, and aggregating or summarizing content for each. Rather than create spaces, portals or social networks to invite citizens' perspectives on any range of topics, the aim was to dynamically discover both issues and issue overlap among sources.
Proposed channels included the Nederlogosphere, the Political Nederlogosphere, the Dutch NGO sphere and Local Bloggers. webantennE was completed as Issuefeed.net.
Building a source set: A Webantenna use case
The German chancellor's office, like many governments all over the world, receives a stack of newspapers and magazines early every morning. The staff browse through the papers, and cut out the most important articles. The articles are placed in a newspaper clippings file, and handed to the analysts and the speaker, who together prepare for the press conference later that morning. They anticipate the questions of the day, and prepare answers as well as talking points.
The above story illustrates a particular method of collection (all the major newspapers and magazines), and a method of analysis (cutting out the most important articles). One could think of the Issuefeed in terms of a similar practice.
These days the question often arises in the same chancellor's office about the major websites. Which websites should be browsed? How does one know a site is important? In other words, it may not be clear to the analysts how to build a list of relevant sites, akin to the list of relevant newspapers.
How to build a source set? The city of Almere, the Netherlands, provided fifteen local URLs that the governmental analysts determined were worth following, in order to have a sense of what issues are arising locally. These Websites included the local political parties, the local news channels, as well as neighborhood groups. How does one know that there are no other significant URLs?
The fifteen URLs were placed in the Issue Crawler, network location software that analyzes the Websites' hyperlinks. Following the links out of the sites, many additional relevant sites were found. (The newly discovered sites received links from the original fifteen -- see map below.) The new, longer list was sent back to the city staff members, who checked each URL's relevance, e.g., whether the sites were active and concerned Almere. Some were removed. In the end a final list was compiled, shown below.
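The discovery step above can be sketched in simplified form. The actual Issue Crawler performs fuller co-link analysis over crawled pages; the sketch below, using only the Python standard library, merely extracts the external hostnames each seed page links out to and keeps those linked from at least two seeds. All URLs, class and function names here are illustrative assumptions, not part of the Issue Crawler itself.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class OutlinkParser(HTMLParser):
    """Collects the external hostnames a page links out to."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.base_host = urlparse(base_url).netloc
        self.outlinks = set()

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        # Resolve relative links, then keep only hosts other than the seed's own
        host = urlparse(urljoin(self.base_url, href)).netloc
        if host and host != self.base_host:
            self.outlinks.add(host)

def candidate_sources(pages, min_seeds=2):
    """pages: dict mapping a seed URL to that page's HTML.
    Returns external hosts linked from at least min_seeds seed pages --
    a crude stand-in for the Issue Crawler's co-link heuristic."""
    counts = {}
    for url, html in pages.items():
        parser = OutlinkParser(url)
        parser.feed(html)
        for host in parser.outlinks:
            counts[host] = counts.get(host, 0) + 1
    return {h for h, n in counts.items() if n >= min_seeds}
```

A host linked from several of the original fifteen sites would surface as a candidate, which the city staff would then vet by hand, much as described above.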
Each URL is entered into Issuefeed.net, and a channel is created.
Issuefeed checks each URL for an RSS feed. All RSS feeds are read into the analytical engine, a trained POS-tagger that fishes out noun phrases. The phrases occurring during the last week are compared with those from the previous time period, using the log-likelihood algorithm.
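The comparison step can be illustrated with Dunning's log-likelihood (G2) statistic, a standard keyness measure for exactly this kind of two-period frequency comparison. The sketch below is an assumption about the flavor of the calculation, not Issuefeed's actual code; the sample counts are invented for illustration.

```python
import math

def log_likelihood(freq_recent, total_recent, freq_prev, total_prev):
    """Dunning's log-likelihood (G2) score for a noun phrase, comparing
    its frequency in the recent week against the previous period."""
    combined = freq_recent + freq_prev
    grand_total = total_recent + total_prev
    expected_recent = total_recent * combined / grand_total
    expected_prev = total_prev * combined / grand_total
    g2 = 0.0
    # Terms with a zero count contribute nothing (lim x*log(x) = 0)
    if freq_recent > 0:
        g2 += freq_recent * math.log(freq_recent / expected_recent)
    if freq_prev > 0:
        g2 += freq_prev * math.log(freq_prev / expected_prev)
    return 2 * g2

# A phrase that surged this week scores high; a stable phrase scores zero
surge = log_likelihood(50, 10000, 5, 10000)    # 50 hits now vs. 5 before
stable = log_likelihood(50, 10000, 50, 10000)  # unchanged frequency
```

Phrases are then ranked by score, so the channel's output foregrounds language that is newly significant rather than merely frequent.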
The Issuefeed outputs significant language on the Websites about the city.
For a description of the software, see about Issue Feed
The new source set: Almere-related Websites, November 2007, input into Issue Feed.