
Journalism: Storytelling in the Geodata Age

By Naomi Bloch

The rise of more accessible geospatial web tools along with expanding sources of open data have fostered a potent—if somewhat techno-utopian—civic vision. For those immersed in understanding this new digital landscape, one question often surfaces: who’s truly putting these resources to use?

The most reliable answer is perhaps an obvious one. “Journalists are making huge use of mapping and geodata for storytelling, for the visualization of stories, and for investigative reporting purposes,” said April Lindgren, an associate professor at Ryerson University’s School of Journalism and founding director of the Ryerson Journalism Research Centre, a Geothink partner organization.

As a scholar, Lindgren’s own research employs data mapping techniques to examine the geography of news coverage and the role of Canadian media in society. “Maps have actually been quite a powerful tool for us to explore patterns of local news and understand how it works. It opened up a whole new way of getting at and understanding the data because we were able to visualize it.

“Before that, it was the old problem of columns and reams of numbers,” Lindgren said. “But being able to map it allowed us to show geographically, yes, most of the news coverage is focused on downtown Toronto. So why is that? And what are the implications of not doing much coverage in other areas of the city? And furthermore, we mapped the types of topics. So what does it mean when most of the news that they publish about certain areas is crime coverage? What does that do in terms of the geographic stereotyping?”

Computer-assisted reporting revisited

Lindgren notes that the use of mapping and data analysis for actual journalistic purposes is not a new phenomenon. Over twenty years ago, in 1993, Miami Herald research editor Steve Doig won a Pulitzer Prize for his investigative coverage of Hurricane Andrew’s aftermath in Florida. The year prior, Doig and his colleagues spent several intensive months processing and evaluating two data sets—one that helped to map out property damage caused by the hurricane and another documenting wind speeds at different locations and times throughout the storm. “They noticed from using mapping that the damage was much more extensive in certain areas than in others, and then they started trying to figure out why that was, because weather-wise it was the same storm,” Lindgren explained.

“What Went Wrong > Miami Herald, December 20, 1992 > Page 1” (originally published Dec. 20, 1992). Flickr photo by Daniel X. O’Neil, licensed under CC BY 2.0

Further investigation unveiled that several different developers had been responsible for real estate construction in different regions. “And it led them to a conclusion and a very powerful piece of journalism showing that it had to do with the building standards of the different developers,” said Lindgren. “So that was one of the early uses of mapping and data journalism, showing what a useful tool it could be.”

As researchers raise questions about the skills and motivations that enable citizen engagement with open data and geospatial technologies, journalism schools are increasingly recognizing the need to integrate a formal understanding of data journalism into the curriculum.

At the 2014 Geothink Annual General Meeting, Lindgren met a fellow researcher with complementary interests—Marcy Burchfield, executive director of the Toronto-based Neptis Foundation. The aim of Neptis has been to apply the unique capabilities of mapping and spatial analysis to help decision makers and the public understand regional issues in the Greater Toronto Area. The Geothink encounter led to the development of a Neptis-led geodata workshop for senior-level students enrolled in Ryerson’s journalism school, exposing students to some statistics basics as well as the various challenges of working with spatial data to develop meaningful stories.

“Getting the data into a usable form, I think, is probably the biggest challenge technically for journalists,” said Lindgren. “Although the skills are rapidly increasing and we’re training our students to do that.”

At Ryerson, undergraduates are required to take an introductory digital journalism course that critically engages with social media and citizen journalism along with new forms of multimedia and alternative storytelling methods. A separate “visualizing facts” elective course aims to provide hands-on experience with various data visualization techniques including mapping, while reinforcing numeracy skills (something that, historically, journalists have not been known for).

Data’s fit for purpose?

CBC News’s crowdsourced, interactive “Pledge to Vote” map, part of their 2015 Canada Votes coverage.

In recent years Canadian data journalists have garnered international attention both for their creative uses of geodata and their involvement in the push for open access to government information. “One of the big problems is the availability of data,” Lindgren said. “What’s available? How good is it? How hard do you have to fight for it? Is it really available through an open data source or do you have to go through Freedom of Information to get it?”

While media outlets are increasingly exploring ways to engage the public in creating crowdsourced content by volunteering their geodata, the data sets journalists tend to be interested in—ideally, data that can support rich, informative stories relevant to the public interest—are not typically collected with the journalist in mind. In particular, government data sources have often been generated to support internal administrative needs, not to address transparency and accountability concerns per se. Data input decisions may not be documented, and agencies may “silently” post-process the information before distributing it to journalists or the broader public. Learning to clean up inconsistent, non-standardized data developed for a very different audience is therefore a particularly important skill for journalists to acquire. Only then can a journalist build an understanding of the data’s patterns and the stories they can support.
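The kind of cleanup work described here can be made concrete with a small, purely hypothetical sketch. The data, field names, and normalization rules below are invented for illustration; real government data sets will have their own quirks, but the pattern—normalizing inconsistent labels and treating placeholder strings as missing values rather than zeros—is typical of what a data journalist does before any analysis.

```python
import csv
import io
import re

# Hypothetical sample of inconsistent, non-standardized records of the
# kind a journalist might receive from a government agency.
raw = """ward,complaints,date
"Ward 1 ",12,2011-03-04
ward 1,N/A,03/04/2011
WARD 2,7,2011-03-05
"""

def normalize_ward(value):
    """Collapse spacing/case variants like 'WARD 2' or ' ward 2 ' to 'Ward 2'."""
    digits = re.search(r"\d+", value)
    return f"Ward {digits.group()}" if digits else value.strip().title()

def normalize_count(value):
    """Treat placeholder strings such as 'N/A' as missing, not as zero."""
    try:
        return int(value)
    except ValueError:
        return None

cleaned = [
    {"ward": normalize_ward(row["ward"]),
     "complaints": normalize_count(row["complaints"])}
    for row in csv.DictReader(io.StringIO(raw))
]
```

After this pass, the three raw rows collapse into two consistently labelled wards, and the “N/A” record is flagged as missing rather than silently counted—exactly the kind of distinction that changes what a map of the data would show.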

“You’re only as good as your data,” Lindgren emphasized. “In some ways the act of journalism allows you to test the data and see how good it is. Because the data may be telling you one thing, but then when you go out on the ground and you start interviewing and looking around you may find that what you’re seeing and hearing doesn’t seem to match what the data is telling you.

“So right away, as a journalist you’re going to be suspicious of that. And there are two places where this could be wrong. Either you’re talking to the wrong people or you’re not talking to a broad enough range of people—or there might be something wrong with the data.”

Verifying data accuracy is a time-honoured tradition

Lindgren shared the example of a colleague who was investigating the issue of slum landlords. The reporter asked the municipality to provide data on property standards complaints. Upon receiving and eventually mapping the data, the reporter and his colleagues made a surprising discovery. “They noticed that there was a section of the city that didn’t have any complaints. They thought that was odd, because they knew that there were a lot of rental areas and low-income areas there, with people living in somewhat vulnerable housing situations.”

Ultimately, the dissonance between the story on the ground and the story in the data led the reporter to go back to the city seeking further verification, and the nature of the problem soon revealed itself. It seems that a summer student had been in charge of aggregating and disseminating the data to the journalists when the information was requested, and that student had overlooked one section of the city.

While this particular story reflects human error during the communication phase rather than the data collection phase, Lindgren points out that the strong journalistic traditions of seeking verification and being suspicious of information sources puts the media in a unique position to evaluate data’s quality. “Verification is a fundamental element of journalism. That’s what we do that’s different from anybody who is just commenting out there online. The main issue is: is it verifiable, and what’s the public interest? That’s the starting point.”

Where public and private interests intersect

What constitutes “public interest” is a conversation that still needs to happen. The push for open data and the fact that personal information is increasingly accessible online have led parties both within and beyond government to raise concerns about how to strike the balance between privacy and transparency—and what the right balance may be. Data sets often contain personal or identifying information, and cleansing the data of that information is not straightforward. Even when data appear anonymized on the surface, there are ever-increasing opportunities to combine and process seemingly unrelated data sets in ways that can identify individuals and compromise personal information. As Geothink co-applicant researcher Teresa Scassa has addressed more than once in her work, this is not a theoretical problem but a reality that is already occurring.

Lindgren, however, said she does not see data journalism as giving rise to new types of ethical concerns for the media. “Obviously, a balance has to be struck. But the reality is that oftentimes the data is very generalized. It really depends on what the issue is and what the information is.

“The whole privacy issue is really a red flag, a lot of times, for journalists, because it can be used by governments as a pretext for not releasing information that governments just don’t want the public to know. The two reasons they don’t release information is privacy and violating commercial interests, and then the third reason is political consideration, but they can’t couch it in those terms.”

In terms of how journalists themselves strike that balance, Lindgren said this must be assessed on a case by case basis. “Basically, our job is invading people’s space, quite often. So we have to—and we do—make those judgment calls every day. The data is just another layer of that, or another area where we’d have to think about it and have those discussions.

“What it comes down to is you’re weighing, what’s the public interest in this information? There’s no hard and fast rule. It depends on what the information is.”

If you have any questions for April, reach her on Twitter here: @aprilatryerson

If you have thoughts or questions about this article, get in touch with Naomi Bloch, Geothink’s digital journalist, at naomi.bloch2@gmail.com.

Local News Research Project map of Toronto news coverage

Crosspost: How is your Toronto neighbourhood portrayed in the news? Check it out using these interactive maps

This post is cross-posted with permission from April Lindgren and Christina Wong at Local News Research Project. 

By April Lindgren and Christina Wong

Concerns about how neighbourhoods are portrayed in the news have surfaced regularly in the Toronto area over the years. But are those concerns valid?

Interactive maps produced by The Local News Research Project (LNRP) at Ryerson University’s School of Journalism are designed to help Toronto residents answer this question. The maps give the public access to data the research project collected on local news coverage by the Toronto Star and the online news website OpenFile.ca. The maps can be used by members of the public and researchers to:

  • get an overall sense of where news in the city is – and isn’t – covered
  • compare patterns of local news coverage by two different news organizations
  • examine the city-wide geographic patterns of reporting on crime, entertainment and other major news topics
  • examine news coverage in each of Toronto’s 44 wards including how often the news stories and photographs reference locations in a ward
  • see what story topics are covered in each ward

The maps are based on the Toronto Star’s local news coverage published on 21 days between January and August, 2011. Researchers have found that a two-week sample of news is generally representative of news coverage over the course of a year (Riffe, Aust & Lacy, 1993). The data for OpenFile.ca, which suspended publishing in 2012, were collected for every day in 2011 between January and August.

Click here to see the maps or continue reading to find out more about news coverage and neighbourhood stereotyping, how the maps work, and the role of open data sources in this project.


Local news and neighbourhood stereotyping
The decision to explore news coverage of Toronto neighbourhoods was prompted by concerns expressed by citizens and local politicians about how certain parts of the city are portrayed in the local media. Residents were furious (Pellettier, Brawley & Yuen, 2013), for instance, when Toronto Star columnist Rosie DiManno referred to the city’s Scarborough area as “Scarberia” in an article about former mayor Rob Ford’s re-election campaign (DiManno, 2013). Back in 2007, then-mayor David Miller went so far as to contact all of the city’s news media asking them to cite the nearest main intersection rather than reporting more generally that a particular crime occurred in Scarborough (Maloney, 2007). In Toronto’s west end, the local city councillor suggested negative connotations associated with the Jane and Finch neighbourhood could be defused by renaming it University Heights, but the idea was vehemently rejected by residents (Aveling, 2009).

A study that investigated how Toronto’s most disadvantaged neighbourhoods were covered by the Toronto Star concluded that there was very little coverage of news in these communities (Lindgren, 2009). The study, which examined Toronto Star local news reporting in 2008, also found that crime tended to dominate the limited coverage that did take place and suggested the problem could be rectified not by ignoring crime stories, but by increasing coverage of other sorts of issues in those communities.


Exploring the maps
The interactive maps allow users to explore local news coverage in the City of Toronto. A sample of local stories and photographs from the Toronto Star (the local newspaper with the largest circulation in the city) and OpenFile.ca (a community-based news website) were identified and analyzed in 2011 to capture data about story topics and mentions of geographic locations.

These maps make the data available to the public in a way that allows users to explore and compare media coverage in different areas of the city. Users can zoom in on a neighbourhood and discover all of the locations referenced within it. Each point on the map represents a location that was referenced in one or more news items. Users can click on any of these points to see a list of news articles associated with each location (Figure 1).

Figure 1. Users can click each point to find out about the news articles that referenced the location

By clicking within a ward boundary, users can also access a summary chart describing the breakdown by subject of all local news coverage in that ward. Users interested in the Scarborough area, for instance, can zoom into that area on the map and click on each Scarborough ward to see what sorts of stories (crime, transit, entertainment, sports, etc.) were reported on in that ward (Figure 2).

Figure 2. Users can click within a ward to access charts summarizing news coverage by topic

Users interested in how and where a particular news topic is covered can access separate interactive maps for the top five subjects covered by the two news sources. Figure 3, for example, shows all locations mentioned in crime and policing stories published by the Toronto Star during the study’s sample period.

Figure 3. Toronto Star coverage of crime and policing news

The role of open data sources in creating these maps
A total of 23 pre-existing datasets were used to support the creation of these interactive maps, including relevant open datasets that were publicly available online in 2008. The datasets were used to populate a list of geographic locations in the GTA that had the potential to be referenced in local news stories. Each dataset was assigned a unique numerical code, and all 23 datasets were appended to a geographic reference table that coders could search. This incorporated reference list of geographic locations and features made the coding process more accurate and efficient: coders entering information about spatial references in local news items could select many of the referenced geographic locations from the pre-populated list rather than entering the information manually, which helped prevent human error and sped up the work.
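The post does not show the LNRP’s actual table schema, but the idea of a searchable reference table built from multiple coded datasets can be sketched in a few lines. Everything below—the dataset codes, place names, and coordinates—is invented for illustration.

```python
# Illustrative sketch (not the LNRP's actual schema): merging several source
# datasets into one searchable reference table, with a numeric code recording
# which dataset each place name came from.

DATASET_CODES = {1: "schools", 2: "parks"}  # hypothetical dataset codes

schools = [("Riverdale Collegiate", 43.67, -79.33)]
parks = [("High Park", 43.65, -79.46)]

reference = {}
for code, rows in ((1, schools), (2, parks)):
    for name, lat, lon in rows:
        # Key on a normalized name so coders find entries regardless of case.
        reference[name.lower()] = {"name": name, "dataset": code,
                                   "lat": lat, "lon": lon}

def lookup(query):
    """Return the pre-coded record for a place name, or None if not listed."""
    return reference.get(query.strip().lower())
```

Because every match returns a record with coordinates already attached, a coder who selects a location from the list never has to type latitude and longitude by hand—which is where the accuracy and speed gains described above come from.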

We would have preferred to use more open data sources during the initial development of the database, but this wasn’t possible due to limited availability of datasets with the spatial attributes that make mapping possible. At that time, only two of the 23 datasets used (approximately 8.7% of the total) were available from open data sources in a format that included geography (such as shapefiles). Both files were obtained from the City of Toronto’s Open Data website. These limitations meant that the majority of the database relied on contributions from private data sources.

The situation has improved over time as more open government data become available in geographic file formats that support research with spatial analysis. As of mid-2015, six more of the 23 datasets (two federal, one provincial and three municipal) used in the database have become available. If we were creating the database today, a total of eight datasets or 34.8% of the initial database could be populated using open data sources (Table 1).

Table 1. Availability of open data sources
                        Available in 2008          Currently available
                        (database creation)        (mid-2015)
Private sources         21                         15
Government open data    2 (8.7% of database)       8 (34.8% of database)
Total # of datasets     23                         23


Since 2008, the Government of Canada has launched its own open data portal, joined the Open Government Partnership alongside other countries supporting the release of open government data, and adopted the G8 Open Data Charter (Standing Committee on Government Operations and Estimates, 2014). Provincial and municipal governments have made similar improvements to open data access. The Government of Ontario launched an online open data catalogue in 2012 and is currently developing an Open Data Directive to be implemented later this year (Fraser, 2015). The City of Toronto introduced its open data portal in 2009 and developed an Open Data Policy in 2012 (City of Toronto, n.d.).

As Table 1 suggests, however, further improvements are required to reduce barriers to research and innovation. A report from the Standing Committee on Government Operations and Estimates, for instance, recommended that the federal government provide data at smaller levels of geography, work together with different levels of government to establish standards and release data, and provide a greater variety of open data to reflect all government departments. The report noted that the release of open data can improve government efficiency, foster citizen engagement, and encourage innovation (Standing Committee on Government Operations and Estimates, 2014). Academic researchers have argued that improvements in the availability of open government data would stimulate valuable research and outcomes with economic and social value (Jetzek, Avital & Bjorn-Andersen, 2014; Kucera, 2015; Zuiderwijk, Janssen & Davis, 2014). Journalists are also pushing for easier and greater access to data (Schoenhoff & Tribe, 2014).


Research conducted by the Local News Research Project was made possible by public funds and as such the data should be widely available. The interactive maps are an attempt to fulfill that obligation.

While the maps capture only a snapshot of news coverage at a fixed point in time, they nonetheless demonstrate the importance of geospatial analysis in local news research (Lindgren & Wong, 2012). They are also a powerful data visualization tool that allows members of the public to independently explore media portrayals of neighbourhoods and the extent to which some parts of a city are represented in the news while others are largely ignored.

Finally, this mapping project also illustrates how open government data can foster research and how much there is still to do in terms of making data available to the public in useful formats.


The Local News Research Project was established in 2007 to explore the role of local news in communities. Funding for this research has been provided by Ryerson University, CERIS-The Ontario Metropolis Centre and the Social Sciences and Humanities Research Council.

About the authors: April Lindgren is an Associate Professor in Ryerson University’s School of Journalism and Academic Director of the Ryerson Journalism Research Centre. Christina Wong is a graduate of Ryerson University’s Geographic Analysis program. Initial work on the maps was done in 2014 by GEO873 students Cory Gasporatto, Lorenzo Haza, Eaton Howitt and Kevin Wink from the same program.



References

Aveling, N. (2009, January 8). Area now being called University Heights, but some call change a rejection of how far we’ve come. Toronto Star, p. A10.

City of Toronto. (n.d.). Open Data Policy. Retrieved from http://www1.toronto.ca/wps/portal/contentonly?vgnextoid=7e27e03bb8d1e310VgnVCM10000071d60f89RCRD

DiManno, R. (2013, July 6). Ford fest makes a strategic move. Toronto Star, p. A2.

Fraser, D. (2015, May 1). Ontario announces more open data, public input. St. Catharines Standard. Retrieved from http://www.stcatharinesstandard.ca/2015/05/01/ontario-announces-more-open-data-public-input

Jetzek, T., Avital, M. & Bjorn-Andersen, N. (2014). Data-driven innovation through open government data. Journal of Theoretical and Applied Electronic Commerce Research, 9(2), 100-120.

Kucera, J. (2015). Open government data publication methodology. Journal of Systems Integration, 6(2), 52-61.

Lindgren, A. (2009). News, geography and disadvantage: Mapping newspaper coverage of high-needs neighbourhoods in Toronto, Canada. Canadian Journal of Urban Research, 18(1), 74-97.

Lindgren, A. & Wong, C. (2012). Want to understand local news? Make a map. 2012 Journalism Interest Group proceedings. Paper presented at Congress 2012 of the Humanities and Social Sciences conference. Retrieved from http://cca.kingsjournalism.com/?p=169

Maloney, P. (2007, January 16). Mayor sticks up for Scarborough. Toronto Star. Retrieved from http://www.thestar.com/news/2007/01/16/mayor_sticks_up_for_scarborough.html?referrer=

Pellettier, A., Brawley, D. & Yuen, S. (2013, July 11). Don’t call us Scarberia [Letter to the editor]. Toronto Star. Retrieved from http://www.thestar.com/opinion/letters_to_the_editors/2013/07/11/dont_call_us_scarberia.html

Riffe, D., Aust, C. F. & Lacy, S. R. (1993). The effectiveness of random, consecutive day and constructed week sampling. Journalism Quarterly, 70, 133-139.

Schoenhoff, S. & Tribe, L. (2014). Canada continues to struggle in Newspapers Canada’s annual FOI audit [web log post]. Retrieved from https://cjfe.org/blog/canada-continues-struggle-newspapers-canada%E2%80%99s-annual-foi-audit

Standing Committee on Government Operations and Estimates. (2014). Open data: The way of the future: Report of the Standing Committee on Government Operations and Estimates. Retrieved from http://www.parl.gc.ca/content/hoc/Committee/412/OGGO/Reports/RP6670517/oggorp05/oggorp05-e.pdf

Zuiderwijk, A., Janssen, M. & Davis, C. (2014). Innovation with open data: Essential elements of open data ecosystems. Information Polity, 19(1, 2), 17-33.

Part 2: Our Project Head on North American Civic Participation and Geothink’s Projects

By Drew Bush

Renee Sieber, associate professor in McGill University’s Department of Geography and School of Environment.

Part 2 (of 2). This is the second in a two-part series with the head of Geothink.ca, Renee Sieber, an associate professor in the Department of Geography and School of Environment at McGill University. In this second part, we pick up the story of how Sieber sees civic participation in North America during an age of technological change. Catch Part 1 here if you missed our coverage of Geothink itself: its vision, goals, and design.

Talking with Renee Sieber means finding exuberance and excitement for each of Geothink.ca’s projects and the work of all the team members, collaborators and partners. One place to start such a conversation is with how many cities make information available to the public.

“Cities are also publishing enormous amounts of data—it’s called open data,” Sieber, Geothink’s head and an associate professor at McGill University, said. “And this data can be turned into applications that for example can allow citizens to more easily know when they should put their recycling out and what types of recycling [exist], where there is going to be traffic congestion or traffic construction, when the next city council meeting will be held and what will be on the city council agenda.”

This open data forms the basis for how many modern technologies use programs to simplify and facilitate citizen interactions with city garbage services, transportation networks or city policies and processes. In particular, one Geothink project aims to interrogate how standards are created for open data—no easy thing, according to Sieber, when you’re talking not just about abstract data but even more abstract metadata.

“So why should one care about that?” Sieber asked. “Well, we should care about that first of all because the reason that people can now get up-to-date transit information in cities all over North America and, indeed, cities all over the world is because of a very small open data standard called GTFS, the General Transit Feed Specification.”

This successful prototype standard (or way of structuring public transportation data) resulted from a partnership between Google and Portland, Oregon. And, according to Sieber, its value lies not in visualizing the data but in standardizing its structure so that it can be used in applications that allow cities to show when the next bus will arrive and the best ways to get from point x to point y, and to put all this information on a map. In fact, Open511, a standard for traffic and road construction, explicitly styles itself after this prototype.
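GTFS’s power comes from how simple it is: a feed is a zip archive of plain CSV text files (stops.txt, trips.txt, stop_times.txt, and so on) with agreed-upon column names. A minimal sketch of the kind of query a transit app performs—here over a tiny, made-up excerpt of stop_times.txt—looks like this:

```python
import csv
import io

# A tiny, invented excerpt of a GTFS stop_times.txt file. Real feeds ship
# this as one of several CSV files inside a zip archive; the column names
# (trip_id, departure_time, stop_id, ...) are part of the GTFS standard.
stop_times = """trip_id,arrival_time,departure_time,stop_id,stop_sequence
trip_A,08:00:00,08:00:30,STOP_1,1
trip_B,08:10:00,08:10:30,STOP_1,1
trip_A,08:05:00,08:05:30,STOP_2,2
"""

def next_departures(stop_id, after):
    """List departures from a stop later than the given HH:MM:SS time."""
    rows = csv.DictReader(io.StringIO(stop_times))
    times = [r["departure_time"] for r in rows
             if r["stop_id"] == stop_id and r["departure_time"] > after]
    return sorted(times)  # GTFS times are zero-padded, so string order works
```

Because every agency publishing GTFS uses these same file and column names, the identical query works against Toronto’s feed, Portland’s, or any other—which is exactly why one small standard unlocked transit information in cities all over the world.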

“It’s really interesting for us to figure out what new data standards will emerge,” added Sieber. “For example, will there be one to show traffic construction all over the country or all over North America?”

Yet it marks only one way Geothink is examining citizen interactions with cities. At Ryerson University, Associate Professor Pamela Robinson is examining civic hackathons, where cities bring together techies and interested citizens to find innovative ways to design and build applications for city data and improve city services. The problem, according to Sieber, is that many such applications or proofs of concept disappear after the hackathons. For example, some recent winners of a hackathon in the United Kingdom felt that too many applications end up in the back alleys of BitBucket or GitHub.

“So it can be a quite frustrating experience,” Sieber said. “And cities and the participants alike look towards ways to try to retain that enthusiasm over time and to build on the proofs of concept to actually deploy the apps. So Pamela is conducting research on how to create that technological sustainability.”

In yet another project, Geothink has partnered with the Nova Scotia Government’s Community Counts program, located in Halifax, to study the preferences of end-users from community-based management organizations or non-profits who use the province’s open data. Community Counts’s mission is to make it easier for such organizations to use information such as socio-demographic data, although the program itself just lost funding in the province’s most recent budget.

“This is very different from working with apps from open data because with apps you generally know who the developers are but you don’t know who the end-users are,” Sieber said. “So we are conducting a project with them to ask questions of the end-users to find out what they find valuable or challenging in using data. And we’ll then infer that to the challenges and opportunities of working with open data that cities produce.”

So how does all this reflect on what civic participation means today in North America? Governments can now know if you visit certain parks, go to certain places for coffee, and meet certain friends while doing either. So, theoretically at least, they can now design urban spaces and cities themselves to be safer, more vibrant, and better suited to the range of activities taking place in these places.

“That seems both incredibly convenient and incredibly Orwellian at the same time,” Sieber said. To find out more about her views of civic participation, stay tuned for our next Geothoughts Podcast by signing up to receive it on iTunes.

If you have thoughts or questions about this article, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca.

Is Raw Data Bad For You? Open Data Obligations of Government.

By: Leah Cooke, Stephanie Piper, Alana Kingdon, and Peter Johnson

*This blog post was written collaboratively during the springtime Geothink meetup between Ryerson University and University of Waterloo students and faculty. The goals of this meetup were to discuss current and future issues related to Geothink research themes.

What strings are attached when governments provide open data to citizens? Alongside the current interest in government open data, questions remain about how government should share data. Specifically, what obligations do governments have beyond simple data provision? These obligations could include educating citizens, contextualizing data, and being receptive to citizen feedback on the data provided. For example, if a government publishes drinking water quality data, does it have a (moral, ethical, operational) obligation to support this data with relevant contextualizing information? We propose five main responses that government could provide when answering this question.

1. Nothing

Providing the data as it exists without any contextual information to aid in understanding the data.

2. Metadata

Defining the details of the data—expanding acronyms, describing field names, and so on—to make the dataset readable for technically adept users.

3. Processed data

Data that includes maps, legends, annotations, or graphs/charts to aid viewers’ understanding, while still including the original data to allow for additional analyses. Also included is descriptive information or explanatory text that may be helpful to users’ understanding of the data.

4. Engagement and responsiveness

A responsive format for the distribution of open data would see a commitment to the sustainability of the data itself, by ensuring updates and maintenance to open data portals. An obligation for citizen engagement would also be present at this level, with governments creating workshops or tools to help citizens become knowledgeable about the data, as well as ensuring two-way communication with those who have questions or suggestions about the data.

5. Interoperable standards for data sets

Data sets are released in a standardized format, with the intention of increasing the accessibility of data for novice users as well as for ease of integrating information from different municipalities for regional analyses.
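The gap between level 1 (raw data) and level 3 (processed, contextualized data) can be made concrete with a small sketch, using the drinking water example raised above. Everything here is invented for illustration—the field names, readings, and the guideline value are assumptions, not real regulatory data.

```python
import csv
import io

# Hypothetical level-1 release: terse field names, no context, no units
# explained. A nontechnical reader cannot tell whether these numbers are
# good news or bad news.
raw = """site,thm_ppb,dt
A,45,2015-06-01
B,110,2015-06-01
"""

HEALTH_LIMIT_PPB = 100  # assumed guideline value, for illustration only

def summarize(csv_text):
    """Level-3-style processing: pair each raw reading with plain-language context."""
    out = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        level = int(row["thm_ppb"])
        status = "exceeds" if level > HEALTH_LIMIT_PPB else "within"
        out.append(f"Site {row['site']}: {level} ppb trihalomethanes "
                   f"({status} the {HEALTH_LIMIT_PPB} ppb guideline)")
    return out
```

The transformation itself is trivial; the point is that the contextualizing knowledge (what “thm_ppb” means, what threshold matters) lives with the publisher, not the citizen—which is why providing only the raw file arguably falls short of the obligations discussed here.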

While these five standards represent different potential ways government can operationally structure and release its data, the question remains: which option is the one that, ethically or morally, should be adopted? Further, government bodies have complex requirements to abide by legislation, including the Accessibility for Ontarians with Disabilities Act (AODA), that also need to be considered when releasing any information. Do these requirements alter these obligations? Beyond the regulations themselves, further accessibility issues arise. Should the data be accessible to various levels of users, from novice to expert? What does this mean for the ethical framework surrounding the release of the data? As data is often released in formats recognized only by technical users, such as .csv files, is there an additional obligation to release data that is open to nontechnical users as well? Inherent in the name “open data” is the assumption that the data is being released in order to increase transparency. It would be natural to assume that it should therefore be accessible to users regardless of their technical skill level.

In conclusion, for municipal governments, providing raw data is really just the first step. Governments that are serious about using open data as a prelude or support to open government need to also provide tools and support to enable data being turned into information. Metadata is not enough, and open data does not replace targeted information and publications created internally and shared with citizens.