Moving Forward with Canadian Census Data

By Naomi Bloch


Choropleth maps of National Household Survey global non-response data at the dissemination-area level, courtesy of Scott Bell. Global non-response rates > 50% resulted in suppression of data for that spatial unit. All maps are classified using a quantile classification scheme.


As we move forward (and backward) with the 2016 return of Canada’s long-form census, questions remain for everyone who uses Statistics Canada’s key socio-economic data. Researchers, local government agencies, community organizations, and industry will still need to use data collected via the 2011 National Household Survey (NHS) and understand how to reconcile that information with long-form census data.

Concerns regarding the reliability of NHS data stem from the lower response rates that resulted from the non-mandatory nature of the 2011 survey. The overall response rate for the survey decreased from 94 percent in 2006 to 69 percent in 2011. Media attention has centred on the fact that Statistics Canada chose not to release survey data for 25 percent of all census subdivisions because response rates for those spatial units were too low. A key question is whether the regions for which we have no reliable data share certain socio-economic characteristics — and if so, how this might impact service provision.
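To make the mapping rules concrete, here is a minimal sketch of the two steps the maps above apply: suppression of units with high non-response, followed by quantile classification. The data, column names, and schema below are invented for illustration and are not Statistics Canada's actual format.

```python
# A minimal sketch, on invented data, of the two rules described above:
# suppress units with global non-response (GNR) above 50%, then classify
# the rest into quantiles. Column names are illustrative assumptions.
import pandas as pd

das = pd.DataFrame({
    "da_id": ["DA001", "DA002", "DA003", "DA004", "DA005", "DA006"],
    "gnr":   [12.5, 63.0, 28.4, 45.1, 51.2, 33.7],  # non-response rate, %
})

# Rule 1: suppress any dissemination area whose GNR exceeds 50%
das["suppressed"] = das["gnr"] > 50.0

# Rule 2: quantile classification of the remaining areas (quartiles here),
# so each map class contains roughly the same number of areas
visible = das.loc[~das["suppressed"]].copy()
visible["map_class"] = pd.qcut(visible["gnr"], q=4, labels=False)

print(visible[["da_id", "gnr", "map_class"]])
```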

Geothink co-applicant researcher Scott Bell, a professor of Geography and Planning at the University of Saskatchewan, has been studying and mapping the spatial patterns of the National Household Survey’s global non-response rates. His work examines various geographic levels and considers response rate patterns relative to several socio-economic variables. Bell found that, across the 15 cities he studied, areas with similar response rates share many socio-economic commonalities.

In this video interview, Bell discusses his research and its implications.

For more from Scott Bell, see also: The Long-term Impacts of the Short-Lived National Household Survey

If you have thoughts or questions about this video interview, get in touch with Naomi Bloch, Geothink’s digital journalist, at naomi.bloch2@gmail.com.

Geothink Student Evan Hamilton Explores Canadian Municipal Open Data and the Role of Journalism

Geothink student Evan Hamilton recently defended his master’s thesis on Toronto data journalists’ use of open data.

By Naomi Bloch

Data journalists are some of the most active users of government open data in Canada. In his recently defended thesis, Evan Hamilton, a master’s student in the University of Toronto’s Faculty of Information, examined the role of data journalists as advocates, users, and producers of open data.

Hamilton’s thesis, titled “Open for reporting: An exploration of open data and journalism in Canada,” addressed four research questions:

  1. Are open data programs in Ontario municipalities developing in a way that encourages effective business and community development opportunities?
  2. How and why do journalists integrate open data in reporting?
  3. What are the major challenges journalists encounter in gaining access to government data at the municipal level?
  4. How does journalism shape open data development at both the policy level and the grassroots level within a municipality?

To inform his work, Hamilton conducted in-depth, semi-structured interviews with three key data journalists in the City of Toronto: Joel Eastwood at the Toronto Star, William Wolfe-Wylie at the CBC, and Patrick Cain at Global News. While open data is often touted as a powerful tool for fostering openness and transparency, in his paper Hamilton notes that there is always the risk that “the rhetoric around open data can also be employed to claim progress in public access, when in fact government-held information is becoming less accessible.”

In an interview with Geothink, Hamilton explained that the journalists made important distinctions between the information currently available on Canadian open data portals and the information they typically seek in order to develop compelling, public-interest news stories. “One of the big things I took away from my interviews was the differentiation that journalists made between Freedom of Information and open data,” said Hamilton. “They were using them for two completely different reasons. Ideally, they would love to have all that information available on open data portals, but the reality is that the portals are just not as robust as they could be right now. And a lot of that information does exist, but unfortunately journalists have to use Freedom of Information requests to get it, which is a process that can take a lot of time and not always lead to the best end result.”

Legal provisions at various levels of government allow Canadians to make special Freedom of Information requests to try to access public information that is not readily available by other means. A nominal fee is usually charged. In Toronto, government agencies generally need to respond to such requests within 30 days. Even so, government responses do not always result in the provision of usable data, and if journalists request large quantities of information, departments have the right to extend the 30-day response time. For journalists, a delay of even a few days can kill a story.

While the journalists Hamilton interviewed recognized that open data portals were limited by a lack of resources, there was also a prevailing opinion that many government agencies still prefer to vet and protect the most socially relevant data. “Some were very skeptical of the political decisions being made,” Hamilton said. “Like government departments are intentionally trying to prevent access to data on community organizations or data from police departments looking at crime statistics in specific areas, and so they’re not providing it because it’s a political agenda.”

Data that helps communities

In his thesis, Hamilton states that further research is needed to better understand the motivations behind government behaviours. A more nuanced explanation involves the differing cultures within specific municipal institutions. “The ones that you would expect to do well, do do well, like the City of Toronto’s Planning and Finance departments,” Hamilton said. “Both of them provide really fantastic data that’s really up-to-date, really useful and accessible. They have people you can talk to if you have questions about the data. So those departments have done a fantastic job. It’s just having all the other departments catch up has been a larger issue.”

An issue of less concern to the journalists Hamilton consulted is privacy. The City’s open data policy stresses a balance between appropriate privacy protection mechanisms and the timely release of information of public value. Hamilton noted that in Toronto, the type of information currently shared as open data poses little risk to individuals’ privacy. At the same time, the journalists he spoke with tended to view potentially high-risk information such as crime data as information for which public interest should outweigh privacy concerns.

Two of the three journalists stressed the potential for data-driven news stories to help readers better understand and address needs in their local communities. According to Hamilton’s thesis, “a significant factor that prevents this from happening at a robust level is the lack of data about marginalized communities within the City.”

The journalists’ on-the-ground perspective echoes the scholarly literature, Hamilton found. If diverse community voices are not involved in developing open data policies and objectives, government efforts are less likely to meet community needs. Because of their relative power, journalists do recognize themselves as representing community interests. “In terms of advocacy, the journalists identify themselves as open data advocates just because they have been the ones pushing the city for the release of data, trying to get things in a usable format, and creating standard processes,” Hamilton said. “They feel they have that kind of leverage, and they act as an intermediary between a lot of groups that don’t have the ability to get to the table during negotiations and policy development. So they’re advocating for their own interests, but as they fulfill that role they’re advocating for marginalized communities, local interest groups, and people who can’t get to the table.”

Policy recommendations

Hamilton’s research also pointed to ways in which data journalists can improve their own professional practices when creating and using open data. “There needs to be more of a conversation between journalists about what data journalism is and how you can use open data,” Hamilton said. “When I talked to them, there was not a thing like, ‘Any time you use a data set in your story you cite the data set or you provide a link to it.’ There’s no standard practice for that in the industry, which is problematic, because then they’re pulling numbers out of nowhere and they’re trusting that you’ll believe it. If you’re quoting from a data set you have to show exactly where you’re getting that information, just like you wouldn’t anonymize a source needlessly.”

While Hamilton concentrated on building a picture of journalists’ open data use in the City of Toronto, his findings resulted in several policy recommendations for government agencies more broadly. First, Hamilton stressed that “as a significant user group, journalists need to be consulted in a formal setting so that open data platforms can be better designed to target their specific needs.” This is necessary, according to Hamilton, in order to permit journalists to more effectively advocate on behalf of their local communities and those who may not have a voice.

Another recommendation is aimed at meeting the needs of open data users who have different levels of competency. Although he recognizes the challenges involved, in his concluding chapter Hamilton writes, “Municipal governments need to allocate more resources to open data programs if they are going to be able to fulfill the needs of both a developer class requiring technical specifications, and a general consumer class that requires tools (for example, visualizations and interactives) to consume the data.”

Finally, Hamilton recommends that municipalities engage in more formal efforts “to combat internal culture in municipal departments that are against publishing public information. Data should be viewed as a public service, and public data should be used in the public interest.”

If you have any questions for Evan, reach him on Twitter here: @evanhams


Evan Hamilton successfully defended his Master of Information thesis on September 29 at the Faculty of Information, University of Toronto. His work was supervised by Geothink co-applicant researcher Leslie Regan Shade, associate professor in the University of Toronto’s Faculty of Information. Other committee members included University of Toronto’s Brett Caraway and Alan Galey (chair), as well as April Lindgren, an associate professor at Ryerson University’s School of Journalism and founding director of the Ryerson Journalism Research Centre, a Geothink partner organization.

Abstract

This thesis describes how open data and journalism have intersected within the Canadian context in a push for openness and transparency in government collected and produced data. Through a series of semi-structured interviews with Toronto-based data journalists, this thesis investigates how journalists use open data within the news production process, view themselves as open data advocates within the larger open data movement, and use data-driven journalism in an attempt to increase digital literacy and civic engagement within local communities. It will evaluate the challenges that journalists face in gathering government data through open data programs, and highlight the potential social and political pitfalls for the open data movement within Canada. The thesis concludes with policy recommendations to increase access to government held information and to promote the role of data journalism in a civic building capacity.

Reference: Hamilton, Evan. (2015). Open for reporting: An exploration of open data and journalism in Canada (MI thesis). University of Toronto.

If you have thoughts or questions about this article, get in touch with Naomi Bloch, Geothink’s digital journalist, at naomi.bloch2@gmail.com.

The Long-term Impacts of the Short-lived National Household Survey

By Naomi Bloch

On November 5, Navdeep Bains, Canada’s new Minister of Innovation, Science and Economic Development (now that’s a mouthful!) confirmed the rumours that the country’s mandatory long-form census will be reinstated in 2016.

But what are the long-term consequences of the interruption in mandatory data collection caused by 2011’s National Household Survey (NHS)? How significant is this short-lived census change likely to be?

Geothink co-applicant researcher Scott Bell, a professor of Geography and Planning at the University of Saskatchewan, has been studying and mapping the spatial patterns of the voluntary National Household Survey data, comparing global non-response rates in metropolitan and non-metropolitan areas across the country. After today’s official announcement, Bell shared a few preliminary thoughts based on his research.

Scott Bell is a professor in the Department of Geography and Planning at the University of Saskatchewan.

Geothink: Now that the mandatory long-form census has been reinstated, is there anything that researchers or others who rely on location-based specifics from census data need to keep in mind?

Scott Bell: In my own research I have been relying on 2006 data for much longer than I would have if the 2011 survey had been the long form of the census. The NHS misrepresents different parts (and types of parts) of the country. In my analysis of 15 Canadian cities, there were lower response rates (measured by non-response) in places with low income, aboriginal populations, new immigrants, and lower rates of education. This is quite troubling since the only solution Stats Canada had at their disposal was oversampling in such areas, which might exacerbate the bias.

Geothink: Did you find that your own recent research was impacted by the 2011 data, and are there likely to be any long-term implications for researchers, given that just one survey period was affected?

Scott Bell: Yes, I was compelled to use long-form data from 2006. It is a relief that we will have a return of this data for 2016. I have always appreciated Canada’s five-year census cycle, and a 10-year wait is going to be OK, this once. But there will be a persistent problem trying to understand our society between 2011 and 2016 that won’t be true of another five-year period. Our understanding of economics, household mobility, finances, household structure, immigration, education, and so on for the period from 2011 to 2016 is diminished.

Geothink: Are there any important considerations to keep in mind, for those integrating data from 2011 and other periods?

Scott Bell: In work I hope to publish in the next six months, patterns of response (actually non-response) and what social and economic variables predict this non-response will be elucidated. The next step might be the development of tools to adjust NHS values in order to make the data collected more reliable. The most important step in this direction will be the collection of the long form in 2016; that data will be useful in establishing estimates of what 2011 values are valid and perhaps allow for the setting of “correction factors” for egregious rates of non-response.
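Bell's correction factors have not yet been published, but the general idea can be sketched. Purely as an illustration (the growth assumption, threshold, column names, and numbers below are all invented, not drawn from his research), one might benchmark 2011 NHS estimates against 2016 long-form values and rescale areas where non-response was egregious:

```python
# Purely illustrative: a hypothetical "correction factor" computed by
# benchmarking 2011 NHS estimates against 2016 long-form values. The 2%
# annual growth assumption, thresholds, and data are all invented.
import pandas as pd

areas = pd.DataFrame({
    "area":        ["A", "B", "C"],
    "nhs_2011":    [41200, 38900, 45600],   # NHS estimate, e.g. median income
    "gnr_2011":    [22.0, 48.0, 35.0],      # global non-response rate, %
    "census_2016": [44800, 47100, 49300],   # mandatory long-form benchmark
})

# Attribute any gap beyond assumed real growth to non-response bias
expected_2016 = areas["nhs_2011"] * (1.02 ** 5)
areas["correction"] = areas["census_2016"] / expected_2016

# Rescale 2011 values only where non-response was egregious
egregious = areas["gnr_2011"] > 30.0
areas.loc[egregious, "nhs_2011_adj"] = (
    areas.loc[egregious, "nhs_2011"] * areas.loc[egregious, "correction"]
)
print(areas)
```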

Stay tuned for more detailed insights from Scott Bell on location-specific considerations of the National Household Survey data, coming soon to Geothink.ca.

If you have thoughts or questions about this article, get in touch with Naomi Bloch, Geothink’s digital journalist, at naomi.bloch2@gmail.com.

Spotlight on Recent Publications: Critical Reflections on Outcomes from Three Geoweb Partnerships

By Naomi Bloch

Exploring university–community partnerships

Participatory geospatial technologies have the potential to support and promote citizen engagement. This great promise has led to more collaborations between academics and community partners interested in pursuing this aim. In their recently published paper, “A web of expectations: Evolving relationships in community participatory geoweb projects,” four Geothink researchers and their colleagues cast a reflective eye on the participatory action research processes behind three completed geoweb partnership projects.

Co-author Jon Corbett, an associate professor in Community, Culture and Global Studies at the University of British Columbia’s Okanagan campus, sees their ACME journal article as helping to fill a gap in the geoweb literature. “For me, one of the things I’m most interested in is how—in a truthful and well-positioned way—we can talk about the veracity of the work that we’ve done in regards to its ability to actually bring about impact and social change,” Corbett said.

In the article, the authors compare the different cases in order to consider some of the tangible, empirical challenges that the projects encountered, concentrating on the frictions that can occur where technical and social considerations intersect.

Central Okanagan Community Food Map interface

Participatory geoweb initiatives commonly rely on out-of-the-box mapping tools. For these three projects, a central aim was to employ the expertise of the university researchers to co-develop and co-evaluate custom geospatial web tools that could address community partners’ objectives. Ideally, such collaborations can benefit all parties. Researchers can learn about the potential and the limitations of the geoweb as a tool for civic engagement while partners have the opportunity to reflect on their objectives and access a wider tool set for accomplishing them. In reality, collaborations require compromises and negotiations. The question then becomes: when are researchers’ academic objectives and partners’ community objectives truly complementary?

In the first case study, the geoweb was used to create a participatory business promotion website for a rural Quebec community, intended as one component of a larger regional economic development strategy. The second case was a collaboration between two university partners and a cultural heritage organization in Ontario. The partners hoped the customized online tool could “serve as a ‘living’ repository of cultural heritage information that was both accessible to the public and could facilitate the contribution of knowledge from the public.” In the third project, university researchers worked with government and grassroots organizations at local as well as provincial levels. The vision in this case was to enable non-expert community members in the Okanagan region to share their own knowledge and experiences about local food and its availability.

Corbett explained that in reflecting on their work, the researchers realized that as social scientists with very specific domains of expertise in political science, geographic information systems, and community research, “the types of skills we needed to negotiate the relationships were far different from the sorts of traditional disciplinary fields that we work in.”  Their collaborators tended to identify the academics more as technical consultants than scholars. As the authors write, “most academics remain untrained in software development, design, marketing, long-term application management and updating, legal related issues, [and] terms of service.”

Although the three collaborations were quite different in terms of the publics involved as well as the negotiated objectives of the projects and the tools employed to achieve them, the authors identified several key common themes. The authors note, “In all three case studies, we found that the process of technology development had substantial influence on the relationship between university developers and community organization partners. This influence was seen in the initial expectations of community partners, differential in power between researcher and community, sustainability of tools and collaborations, and the change from research collaboration towards ‘deal making.'”

In the end, Corbett said, “All of the projects were extremely precarious in how we could assign value or success to them. The paper was really an academic reflection on the outcomes of those three different projects.”

Abstract

New forms of participatory online geospatial technology have the potential to support citizen engagement in governance and community development. The mechanisms of this contribution have predominantly been cast in the literature as ‘citizens as sensors’, with individuals acting as a distributed network, feeding academics or government with data. To counter this dominant perspective, we describe our shared experiences with the development of three community-based Geospatial Web 2.0 (Geoweb) projects, where community organizations were engaged as partners, with the general aim to bring about social change in their communities through technology development and implementation. Developing Geoweb tools with community organizations was a process that saw significant evolution of project expectations and relationships. As Geoweb tool development encountered the realities of technological development and implementation in a community context, this served to reduce organizational enthusiasm and support for projects as a whole. We question the power dynamics at play between university researchers and organizations, including project financing, both during development and in the long term. How researchers managed, or perpetuated, many of the popular myths of the Geoweb, namely that it is inexpensive and easy to use (though not to build, perhaps) impacted the success of each project and the sustainability of relationships between researcher and organization. Ultimately, this research shows the continuing gap between the promise of online geospatial technology, and the realities of its implementation at the community level.

Reference: Johnson, Peter A., Jon Corbett, Christopher Gore, Pamela J. Robinson, Patrick Allen, and Renee E. Sieber. (2015). A web of expectations: Evolving relationships in community participatory geoweb projects. ACME: An International E-Journal for Critical Geographies, 14(3), 827-848.

If you have thoughts or questions about this article, get in touch with Naomi Bloch, Geothink’s digital journalist, at naomi.bloch2@gmail.com.

Journalism: Storytelling in the Geodata Age

By Naomi Bloch

The rise of more accessible geospatial web tools, along with expanding sources of open data, has fostered a potent—if somewhat techno-utopian—civic vision. For those immersed in understanding this new digital landscape, one question often surfaces: who’s truly putting these resources to use?

The most reliable answer is perhaps an obvious one. “Journalists are making huge use of mapping and geodata for storytelling, for the visualization of stories, and for investigative reporting purposes,” said April Lindgren, an associate professor at Ryerson University’s School of Journalism and founding director of the Ryerson Journalism Research Centre, a Geothink partner organization.

As a scholar, Lindgren’s own research employs data mapping techniques to examine the geography of news coverage and the role of Canadian media in society. “Maps have actually been quite a powerful tool for us to explore patterns of local news and understand how it works. It opened up a whole new way of getting at and understanding the data because we were able to visualize it.

“Before that, it was the old problem of columns and reams of numbers,” Lindgren said. “But being able to map it allowed us to show geographically, yes, most of the news coverage is focused on downtown Toronto. So why is that? And what are the implications of not doing much coverage in other areas of the city? And furthermore, we mapped the types of topics. So what does it mean when most of the news that they publish about certain areas is crime coverage? What does that do in terms of the geographic stereotyping?”

Computer-assisted reporting revisited

Lindgren notes that the use of mapping and data analysis for actual journalistic purposes is not a new phenomenon. Over twenty years ago, in 1993, Miami Herald research editor Steve Doig won a Pulitzer Prize for his investigative coverage of Hurricane Andrew’s aftermath in Florida. The year prior, Doig and his colleagues spent several intensive months processing and evaluating two data sets—one that helped to map out property damage caused by the hurricane and another documenting wind speeds at different locations and times throughout the storm. “They noticed from using mapping that the damage was much more extensive in certain areas than in others, and then they started trying to figure out why that was, because weather-wise it was the same storm,” Lindgren explained.

“What Went Wrong > Miami Herald, December 20, 1992 > Page 1” (originally published Dec. 20, 1992). Flickr photo by Daniel X. O’Neil, licensed under CC BY 2.0

Further investigation unveiled that several different developers had been responsible for real estate construction in different regions. “And it led them to a conclusion and a very powerful piece of journalism showing that it had to do with the building standards of the different developers,” said Lindgren. “So that was one of the early uses of mapping and data journalism, showing what a useful tool it could be.”

As researchers raise questions about the skills and motivations that enable citizen engagement with open data and geospatial technologies, journalism schools are increasingly recognizing the need to integrate a formal understanding of data journalism into the curriculum.

At the 2014 Geothink Annual General Meeting, Lindgren met a fellow researcher with complementary interests—Marcy Burchfield, executive director of the Toronto-based Neptis Foundation. The aim of Neptis has been to apply the unique capabilities of mapping and spatial analysis to help decision makers and the public understand regional issues in the Greater Toronto Area. The Geothink encounter led to the development of a Neptis-led geodata workshop for senior-level students enrolled in Ryerson’s journalism school, exposing students to some statistics basics as well as the various challenges of working with spatial data to develop meaningful stories.

“Getting the data into a usable form, I think, is probably the biggest challenge technically for journalists,” said Lindgren. “Although the skills are rapidly increasing and we’re training our students to do that.”

At Ryerson, undergraduates are required to take an introductory digital journalism course that critically engages with social media and citizen journalism along with new forms of multimedia and alternative storytelling methods. A separate “visualizing facts” elective course aims to provide hands-on experience with various data visualization techniques including mapping, while reinforcing numeracy skills (something that, historically, journalists have not been known for).

Data’s fit for purpose?

CBC News’s crowdsourced, interactive “Pledge to Vote” map, part of their 2015 Canada Votes coverage.

In recent years Canadian data journalists have garnered international attention both for their creative uses of geodata and their involvement in the push for open access to government information. “One of the big problems is the availability of data,” Lindgren said. “What’s available? How good is it? How hard do you have to fight for it? Is it really available through an open data source or do you have to go through Freedom of Information to get it?”

While media outlets are increasingly exploring the possibilities of crowdsourced content built from geodata that members of the public volunteer, the data sets that journalists tend to be interested in—ideally, data that can support rich, informative stories relevant to the public interest—are not typically collected with the journalist in mind. In particular, government data sources have often been generated to support internal administrative needs, not to address transparency and accountability concerns per se. Data input decisions may not be documented, and agencies may “silently” post-process the information before distributing it to journalists or the greater public. This makes learning how to clean up inconsistent, non-standardized data developed for a very different audience a particularly important skill for journalists to acquire. Only then can a journalist build an understanding of the data’s patterns and the stories they can support.
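As a hypothetical illustration of that clean-up work (the department names and counts below are invented), consider the same agency recorded several different ways in a released data set. Before any totals can be trusted, the variants have to be collapsed into one standard form:

```python
# A hypothetical example of cleaning non-standardized values: the same
# department recorded five different ways. Names and counts are invented.
import pandas as pd

raw = pd.DataFrame({
    "department": ["Parks & Rec", "parks and rec.", "PARKS AND RECREATION",
                   "Planning Dept", "planning dept."],
    "complaints": [12, 7, 30, 5, 9],
})

def normalize(name: str) -> str:
    """Collapse case, punctuation, and common abbreviations."""
    name = name.lower().strip().rstrip(".")
    name = name.replace("&", "and")
    expansions = {"rec": "recreation", "dept": "department"}
    return " ".join(expansions.get(word, word) for word in name.split())

raw["department_clean"] = raw["department"].map(normalize)

# Totals only make sense once the variants are collapsed
print(raw.groupby("department_clean")["complaints"].sum())
```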

“You’re only as good as your data,” Lindgren emphasized. “In some ways the act of journalism allows you to test the data and see how good it is. Because the data may be telling you one thing, but then when you go out on the ground and you start interviewing and looking around you may find that what you’re seeing and hearing doesn’t seem to match what the data is telling you.

“So right away, as a journalist you’re going to be suspicious of that. And there are two places where this could be wrong. Either you’re talking to the wrong people or you’re not talking to a broad enough range of people—or there might be something wrong with the data.”

Verifying data accuracy is a time-honoured tradition

Lindgren shared the example of a colleague who was investigating the issue of slum landlords. The reporter asked the municipality to provide data on property standards complaints. Upon receiving and eventually mapping the data, the reporter and his colleagues made a surprising discovery. “They noticed that there was a section of the city that didn’t have any complaints. They thought that was odd, because they knew that there were a lot of rental areas and low-income areas there, with people living in somewhat vulnerable housing situations.”

Ultimately, the dissonance between the story on the ground and the story in the data led the reporter to go back to the city seeking further verification, and the nature of the problem soon revealed itself. It seems that a summer student had been in charge of aggregating and disseminating the data to the journalists when the information was requested, and that student had overlooked one section of the city.

While this particular story reflects human error during the communication phase rather than the data collection phase, Lindgren points out that the strong journalistic traditions of seeking verification and being suspicious of information sources puts the media in a unique position to evaluate data’s quality. “Verification is a fundamental element of journalism. That’s what we do that’s different from anybody who is just commenting out there online. The main issue is: is it verifiable, and what’s the public interest? That’s the starting point.”

Where public and private interests intersect

What constitutes “public interest” is a conversation that still needs to happen. The push for open data and the fact that personal information is increasingly accessible online have led parties both within and beyond government to raise concerns about how to strike the balance between privacy and transparency—and what the right balance may be. Data sets often contain personal or identifying information, and cleansing the data of that information is not straightforward. Even when data appear anonymized on the surface, there are ever-increasing opportunities to combine and process seemingly unrelated data sets in ways that can identify individuals and compromise personal information. As Geothink co-applicant researcher Teresa Scassa has addressed more than once in her work, this is not a theoretical problem but a reality that is already occurring.
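A toy example makes the linkage risk concrete. Everything below is invented, but it shows how an “anonymized” release can be joined to a public list on a few shared attributes, often called quasi-identifiers, to re-attach names to sensitive records:

```python
# Invented data illustrating the re-identification risk described above:
# an "anonymized" release joined to a public list on quasi-identifiers.
import pandas as pd

# Release with direct identifiers removed but attributes intact
anonymized = pd.DataFrame({
    "postal_area": ["M5V", "M5V", "K1A"],
    "birth_year":  [1975, 1988, 1975],
    "sex":         ["F", "M", "F"],
    "sensitive":   ["record A", "record B", "record C"],
})

# A public data set that carries names (e.g., a membership list)
public = pd.DataFrame({
    "name":        ["Jane Doe", "John Roe"],
    "postal_area": ["K1A", "M5V"],
    "birth_year":  [1975, 1988],
    "sex":         ["F", "M"],
})

# A plain join on the shared attributes re-attaches names to records
linked = public.merge(anonymized, on=["postal_area", "birth_year", "sex"])
print(linked[["name", "sensitive"]])  # two individuals re-identified
```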

Lindgren, however, said she does not see data journalism as giving rise to new types of ethical concerns for the media. “Obviously, a balance has to be struck. But the reality is that oftentimes the data is very generalized. It really depends on what the issue is and what the information is.

“The whole privacy issue is really a red flag, a lot of times, for journalists, because it can be used by governments as a pretext for not releasing information that governments just don’t want the public to know. The two reasons they don’t release information is privacy and violating commercial interests, and then the third reason is political consideration, but they can’t couch it in those terms.”

In terms of how journalists themselves strike that balance, Lindgren said this must be assessed on a case-by-case basis. “Basically, our job is invading people’s space, quite often. So we have to—and we do—make those judgment calls every day. The data is just another layer of that, or another area where we’d have to think about it and have those discussions.

“What it comes down to is you’re weighing, what’s the public interest in this information? There’s no hard and fast rule. It depends on what the information is.”

If you have any questions for April, reach her on Twitter here: @aprilatryerson

If you have thoughts or questions about this article, get in touch with Naomi Bloch, Geothink’s digital journalist, at naomi.bloch2@gmail.com.

Crosspost: Being Philosophical About Crowdsourced Geographic Information

This Geo: Geography and Environment blog post is cross-posted with permission from the authors, Renée Sieber (McGill University, Canada) and Muki Haklay (University College London, UK).

By Renée Sieber and Muki Haklay

Our recent paper, The epistemology(s) of volunteered geographic information: a critique, started from a discussion we had about changes within the geographic information science (GIScience) research communities over the past two decades. We’ve both been working in the area of participatory geographic information systems (GIS) and critical studies of GIScience since the late 1990s, engaging people from all walks of life with the information that is available in GIS. Many times we’d work together with people to create new geographic information and maps. Our goal was to help reflect their point of view of the world and their knowledge about local conditions, rather than always aiming for universal rules and principles. For example, the image below is from a discussion with the community in Hackney Wick, London, where individuals collaborated to ensure the information to be captured represented their views on the area and its future, in light of the Olympic works that happened on their doorstep.

The GIScience research community, by contrast, emphasizes quantitative modelling and universal rules about geographic information (exemplified by frequent mentions of Tobler’s first law of geography). It was not especially welcoming of qualitative, participatory mapping efforts, leaving them mostly at the margins of the discipline.

Participatory Mapping in Hackney Wick, London, 2007

Around 2005, researchers in GIScience started to notice that when people used their Global Positioning System (GPS) devices to record where they took pictures, or used online mapping apps to make their own maps, they were generating a new kind of geographic information. Once projects like OpenStreetMap and other sources of user-generated geographic information came onto the scene, the early hostility evaporated and volunteered geographic information (VGI) or crowdsourced geographic information was embraced as a valid, valuable and useful source of information for GIScience research. More importantly, VGI became an acceptable research subject, raising questions such as how to assess its quality and what motivates people to contribute.

This about-face was puzzling, and we felt that it justified an investigation of the concepts and ideas that allowed it to happen. Why did VGI become part of the “truth” in GIScience? In philosophical language, the questions ‘Where does knowledge come from? How was it created? What is the meaning and truth of knowledge?’ belong to epistemology, and our paper evolved into an exploration of the epistemology, or more accurately the multiple epistemologies, inherent in VGI. It’s easy to make the case that VGI is a new way of knowing the world, with (1) its potential to disrupt existing practices (e.g. the way OpenStreetMap provides an alternative to official maps, as shown in the image below) and (2) the way VGI both constrains contributions (e.g., 140 characters) and opens contributions (e.g., with its ease of user interface; with its multimedia offerings). VGI affords a new epistemology, a new way of knowing geography, of knowing place. Rather than observing a way of knowing, we were interested in what researchers thought was the epistemology of VGI. They were building it in real time and attempting to ensure it conformed to existing ways of knowing. An analogy would be: instead of knowing a religion from the inside, you construct your conception of it, with your own assumptions and biases, while you are on the outside. We argue that this construction was occurring with VGI.

OpenStreetMap mapping party (Nono Fotos)

We likewise were interested in the way that long-standing critics of mapping technologies would respond to new sources of data and new platforms for that data. Criticism tends to be grounded in the structuralist works of Michel Foucault on power and how it is influenced by wider societal structures. Critics extended traditional notions of volunteerism and empowerment to VGI, without necessarily examining whether or not these were applicable to the new ‘ecosystem’ of geospatial apps companies, code and data. We also were curious why the critiques focussed on the software platforms used to generate the data (e.g., Twitter) instead of the data themselves (tweets). It was as if the platforms used to create and share VGI are embedded in various socio-political and economic configurations, but the data were innocent of association with those assemblages.

Lastly, we saw an unconscious shift in the Critical GIS/GIScience field from the collective to the personal. Historically, in the wider field of human geography, when we thought of civil society mapping together by using technology, we looked at collective activities like counter-mapping (e.g., a community fights an extension to an airport runway by conducting a spatial analysis to demonstrate the adverse impacts of noise or pollution on the surrounding geography). We believe the shift occurred because Critical GIS scholars were never comfortable with community and consensus-based action in the first place. In hindsight, it probably is easier to critique the (individual) emancipatory potential as opposed to the (collective) empowerment potential of the technology. Moreover, Critical GIS researchers have shifted their attention away from geographic information systems towards the software stack of geospatial software and geosocial media, which raises questions about what is considered under this term. For all of these reasons and more, we decided to investigate the “world building” of both the instrumentalist scientists and their critics.

We do use some philosophical framing—Borgmann has a great idea called the device paradigm—to analyse what is happening, and we hope that the paper will contribute to the debate in the critical studies of geographical information beyond the confines of GIScience to human geography more broadly.

About the authors: Renée E. Sieber is an Associate Professor in the Department of Geography and the School of Environment at McGill University. Muki Haklay is Professor of Geographical Information Science in the Department of Civil, Environmental and Geomatic Engineering at University College London.

Geothink Student Twitter Chat on Location and Privacy on the Geoweb

Laura Garcia, a PhD student at the University of Ottawa supervised by Prof. Elizabeth Judge, recently conducted a Spanish-language Twitter chat with students at Los Andes University.

Discussion revolved around privacy issues, especially in location-based services on the Geoweb 2.0. Using the hashtag #locationmine, participants discussed how location is both ‘mine’ in the sense of being very personal and private information, and a mine of data to be exploited. Protecting privacy requires education, laws, regulation, and maybe even changes to technologies (such as the creation of standards). We are in the midst of changes in the technological landscape that are already affecting the amount of privacy internet users can realistically have, and this will continue into the future. Not only is technology changing, our habits are changing as well, with many users agreeing to terms of use without properly examining or thinking through the details. Locational privacy must be debated and defined in response to changes in this ecosystem, to enable proper regulation and protection of rights.

Laura presented the discussants with five conclusions:

  1. One of the most important elements of the right to privacy is for users to have control over the information they share and who has access to this information
  2. It is not easy to detect or opt out of the automatic collection of geographic information by some technologies and companies; in these cases, users have no control over the collection of their locational information
  3. It is important for the users of the Geoweb to take an active role in the protection of their privacy
  4. Better regulations are needed. These need to be mandatory and unambiguous
  5. Civil society needs to advocate for its own rights and demand corporate social responsibility

View the chat transcript below.

Geothoughts 7: Unpacking the Current and Future Value of Open Civic Data

Geothink researcher Peter Johnson and his students have been working with government partners across the country to examine the state of civic open data projects in Canada.

By Naomi Bloch

Peter Johnson image

Peter Johnson, assistant professor in the University of Waterloo Department of Geography and Environmental Management, was recently awarded Ontario’s Young Researcher Award.

Geothink co-applicant researcher Peter A. Johnson is an assistant professor of Geography and Environmental Management at the University of Waterloo. Johnson and his students have been working with Geothink government partners across the country to examine the state of civic open data projects in Canada. In our latest podcast, he discusses how the seemingly desirable ethos of open data may nonetheless hamper our understanding of how end users are interacting with government products.

In their July article published in Government Information Quarterly, Johnson and Geothink head Renee Sieber discuss what they see as the dominant models—and related challenges—of civic open data today. The authors suggest that these models may carry potentially conflicting motivations: governments can distribute data and leave it to users to discover and determine its value; they may aim to track civic issues in ways that are cost-efficient; or they may try to support market innovation via data provision and the promotion of crowdsourced contributions. On the other hand, open data efforts also have the potential to enable productive and empowering two-way civic interactions when motivated by non-economic imperatives.

What future directions will government data provision take? That may depend a lot on the choices that government agencies—and end users—make today.


If you have thoughts or questions about this podcast, get in touch with Naomi Bloch, Geothink’s digital journalist, at naomi.bloch2@gmail.com.

Reference
Sieber, R. E., & Johnson, P. A. (2015). Civic open data at a crossroads: Dominant models and current challenges. Government Information Quarterly, 32(3), 308-315. doi:10.1016/j.giq.2015.05.003

Abstract
As open data becomes more widely provided by government, it is important to ask questions about the future possibilities and forms that government open data may take. We present four models of open data as they relate to changing relations between citizens and government. These models include: a status quo ‘data over the wall’ form of government data publishing, a form of ‘code exchange’, with government acting as an open data activist, open data as a civic issue tracker, and participatory open data. These models represent multiple end points that can be currently viewed from the unfolding landscape of government open data. We position open data at a crossroads, with significant concerns of the conflicting motivations driving open data, the shifting role of government as a service provider, and the fragile nature of open data within the government space. We emphasize that the future of open data will be driven by the negotiation of the ethical-economic tension that exists between provisioning governments, citizens, and private sector data users.