Category Archives: Academic Output

Local News Map Will Be First To Highlight Disparities in Coverage Across Canada

The Local News Map launched by Geothink Co-Applicant Jon Corbett and Partner April Lindgren asks Canadian communities to report how news coverage has changed for them.

By Drew Bush

The impact of newsroom cutbacks, consolidations, and closures across Canada will be the focus of a new crowdsourced online geoweb map. The public can contribute to it now—with the full map available online this June.

“The idea of the map is it will allow us to gather data that we have not been able to gather on our own just because there is so much data out there,” said Geothink Partner April Lindgren, an associate professor at Ryerson University’s School of Journalism and founding director of the Ryerson Journalism Research Centre.

The project stems from a belief that Canadians who live in smaller cities, suburban municipalities, and rural areas typically have fewer outlets to turn to for local news coverage. For that reason, the project’s list of communities includes municipalities that have experienced a major disruption in local news sources (such as the closure of a daily newspaper or television station).

The map will be one part of the project “Election News, local information and community discourse: Is Twitter the new public sphere?” that is headed by Jaigris Hodson, an assistant professor of Interdisciplinary Studies at Royal Roads University. Geothink Co-Applicant Jon Corbett, an associate professor in Community, Culture and Global Studies at the University of British Columbia, Okanagan, helped design it with his graduate students in the Spatial Information for Community Engagement (SPICE) Lab using the GeoLive platform featured in previous Geothink research.

“What we did is we went back to 2008 and we tried to find all the instances where a local news organization had either closed or scaled back service or something new had been launched,” Lindgren said in March while the map was being developed. “And so we populated the map as much as possible with information that we could find. But obviously there is lots and lots of other information out there that has happened since 2008. And there is probably lots of stuff going on right now that we do not know about.”

“So the idea of the crowdsourcing is it will allow us to obviously draw upon the expertise and knowledge of the local news landscape of people who live in communities,” she added. “And they will be able to contribute those pieces of information to the map to make it more robust and comprehensive.”

The map can document gains, losses, service increases, and service reductions at local online, radio, television and newspaper outlets across the country. Now that the map is open to contributions, members of the public can add information about changes to the local news landscape in their own communities. The map’s administrators will verify user submitted content so that the map remains accurate.


For a closer look at this project and the map, check out our video. In it, Corbett walks the user through a step-by-step view of the map and how to contribute while Lindgren discusses the importance of this work.


Making the Map

Many researchers have looked at the critical information needs of communities on topics such as education, health, security and emergency responses, Lindgren said. This in turn led her to think about how we know whether there is adequate media provision in Canadian communities and where media have been lost or added. Still another related question is what online local news sites or social media have sprung up to fill any gaps.

Through attendance at last year’s Geothink Annual General Meeting in Waterloo, Lindgren was put in touch with Corbett. Eight months later, they had created a beta version of the map that included a couple hundred entries. Some emerging trends in the data include the consolidation and closure of community newspapers in Quebec and British Columbia.

“April had this idea that she wanted to better communicate information about how news media had changed over the period of the last eight years or so in Canada,” Corbett says of his meeting last May with Lindgren that began work by his lab to develop the map. “Because there really has been a lot of activity. Some newspapers have gotten larger. Others have closed down. There is a general move to web based media.”

His group has spent months ironing out the technical details of making this map presentable and ready for launch. Lindgren has provided feedback and advice on it through each stage.

“It has been an awful lot more complicated than we originally intended precisely because there has been so much activity and there is so much difference in this type of activity across Canada,” Corbett added. “For example, we have four major types of media. We have newspaper, we have radio, we have TV, and we have the web. And then within each one of those different types, we have a whole series of other information we need to convey.”

For example, the newspaper category of the map alone contains free dailies, free weeklies, and paid newspapers. It also must contain a measure of how such types have either declined or increased in different localities through time.

“And so we see all of this sort of compounding levels of complexity around the data that we need to present,” he said. “Because of course one of the problems with maps is that presenting information in an effective way requires an awful lot of thought about the types of information being presented and how you actually present that type of information. It needs to be beautiful, it needs to be engaging, but it also needs to be informative.”

Corbett’s group has used color, typography, and more to make the map easily accessible to users. But he notes it is still a challenge to display all the transformations from January 2008 to the present. And the issue of time—as it is portrayed in the map—will only become more important as users begin to use it to display events taking place during specific years.

Getting Involved

Lindgren and Corbett are both excited for the map’s launch and the public’s participation. Right now the map needs richer input on new online news sites launched in Canada, Lindgren said. This is an issue she plans to keep an eye on when users begin contributing with greater frequency, to determine to what extent these organizations are viable and fill gaps left by the closure of local newspapers and television stations.

Lindgren also believes the map will appeal to a range of communities, including local governments, individual community members, and journalists. She points out that in the coming weeks there are a number of ways for the public to get involved.

“First of all, when they add a piece of data, they can comment,” Lindgren said. “Or they can comment on any other developments on the map that they want. And we have also incorporated a survey so that people can fill out the survey and tell us a little bit about where they go for their local news. Whether they feel adequately informed about various topics ranging from politics to education to other local issues.”

In case you missed it in the links above, find the map here to contribute your information: https://localnewsmap.geolive.ca/

###

If you have thoughts or questions about the article, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca.

Getting a Better Handle on Geosocial Data with Geothink Co-Applicant Robert Feick

Images and text from sites like Flickr (the source of this image) provide geosocial data that University of Waterloo Associate Professor Robert Feick and his graduate students work to make more useful to planners and citizens.

By Drew Bush

A prevailing view of volunteered geographic information (VGI) is that large datasets exist equally across North American cities and spaces within them. Such data should therefore be readily available for planners wishing to use it to aid in decision-making. In a paper published last August in Cartography and Geographic Information Science, Geothink Co-Applicant Rob Feick put this idea to the test.

He and co-author Colin Robertson tracked Flickr data across 481 urban areas in the United States to determine what characteristics of a given city space correspond to the most plentiful data sets. This research allowed Feick, an associate professor in the University of Waterloo’s School of Planning, to determine how representative this type of user-generated data is across and within cities.

The paper (entitled Bumps and bruises in the digital skins of cities: Unevenly distributed user-generated content across U.S. urban areas) reports that coverage varies greatly between downtown cores and suburban spaces, as may be expected, but also that such patterns differ markedly between cities that appear similar in terms of size, function and other characteristics.
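
For readers who want a concrete sense of what such an analysis involves, here is a minimal sketch of relating per-city photo counts to socioeconomic descriptors with a generalized linear model, in the spirit of the approach described in the paper's abstract (reproduced below). The input file, column names, and model terms are invented for illustration and are not the authors' actual data or code.

```python
# Hypothetical sketch: relate geotagged-photo counts to city-level descriptors
# with a Poisson GLM. All file and column names are illustrative assumptions.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

cities = pd.read_csv("city_summaries.csv")  # one row per urban area

# Photo counts are non-negative counts, so a Poisson GLM with population as
# an exposure term is a natural starting point.
model = smf.glm(
    "photo_count ~ median_income + pct_college + tourism_jobs",
    data=cities,
    family=sm.families.Poisson(),
    exposure=cities["population"],
).fit()

print(model.summary())
```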

“Often it’s portrayed as if these large data resources are available everywhere for everyone and there aren’t any constraints,” he told Geothink.ca recently about this ongoing research. Since these data sets are often repurposed to learn more about how people perceive places, this misconception can have clear implications for those working with such data sets, he added.

“Leaving aside all the other challenges with user generated data, can we take an approach that’s been piloted let’s say in Montreal and assume that it’s going to work as well in Hamilton, or Calgary, or Edmonton and so on?” he said. Due to variations in VGI coverage, tools developed in one local context may not produce the same results elsewhere in the same city or in other cities.

The actual types of data used in research like Feick’s can vary. Growing amounts of data from social media sites such as Flickr, Facebook, and Twitter, and transit or mobility applications developed by municipalities include geographic references. Feick and his graduate students work to transform such large datasets—which often include many irrelevant (and unruly) user comments or posts—into something that can be useful to citizens and city officials for planning and public engagement.

“My work tends to center on two themes within the overall Geothink project,” Feick said. “I have a longstanding interest in public engagement and participation from a GIS perspective—looking at how spatial data and tools condition and, hopefully, improve public dialogue. And the other broad area that I’m interested in is methods that help us to transform these new types of spatial data into information that is useful for governments and citizens.”

“That’s a pretty broad statement,” he added. “But in a community and local context, I’m interested in both understanding better the characteristics of these data sources, particularly data quality, as well as the methods we can develop to extract new types of information from large scale VGI resources.”

Applying this Research Approach to Canadian Municipalities

Much of Feick’s Geothink-related research at the University of Waterloo naturally involves work in the Canadian context of Kitchener, Waterloo, and the province of Ontario. He’s particularly proud of the work being done by his graduate students, Ashley Zhang and Maju Sadagopan. Both are undertaking projects that illustrate the two areas of research focus mentioned above.

Many municipalities offer Web map interfaces that allow the public to place comments in areas of interest to them. Sadagopan’s work centres on providing a semi-automated approach for classifying these comments. In many cases, municipal staff have to read each comment and manually view where the comment was placed in order to interpret a citizen’s concerns.

Sadagopan is developing spatial database tools and rule-based logic that use keywords in comments as well as information about features (e.g. buildings, roads, etc.) near their locations to filter and classify hundreds of comments and identify issues and areas of common concern. This work is being piloted with the City of Kitchener using data from a recent planning study of the Iron Horse Trail that runs through Kitchener and Waterloo.
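
To make the idea concrete, here is a minimal, hypothetical sketch of rule-based classification that combines keywords found in a comment with the type of the nearest mapped feature. The rules, categories, features, and coordinates are illustrative assumptions, not Sadagopan's actual tools.

```python
# Hypothetical rule-based classifier: a comment is labelled when its keywords
# and the nearest feature type both match a rule. Everything here is invented
# for illustration.
from math import hypot

RULES = {
    "trail_safety": {"keywords": {"dark", "unsafe", "lighting"}, "near": {"trail", "park"}},
    "traffic": {"keywords": {"speeding", "traffic", "crossing"}, "near": {"road", "intersection"}},
}

# (x, y, feature_type) in a projected coordinate system
FEATURES = [(100.0, 250.0, "trail"), (400.0, 120.0, "road")]

def nearest_feature_type(x, y):
    return min(FEATURES, key=lambda f: hypot(f[0] - x, f[1] - y))[2]

def classify(comment, x, y):
    words = set(comment.lower().split())
    near = nearest_feature_type(x, y)
    for label, rule in RULES.items():
        if words & rule["keywords"] and near in rule["near"]:
            return label
    return "unclassified"

print(classify("The lighting here is poor and it feels unsafe", 110.0, 240.0))
# -> trail_safety
```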

Zhang’s work revolves around two projects that relate to light rail construction that is underway in the region of Waterloo. First, she is using topic modeling approaches to monitor less structured social media and filter data that may have relevance to local governments.
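
As a rough illustration of what topic modelling on short civic posts involves, the toy sketch below fits a small latent Dirichlet allocation model to a handful of invented posts. It shows the general technique only, and is not Zhang's actual pipeline or data.

```python
# Toy topic-modelling sketch: discover word groupings in short civic posts.
# The posts and the number of topics are illustrative assumptions.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

posts = [
    "LRT construction has closed the sidewalk on King Street again",
    "Great turnout at the farmers market downtown this weekend",
    "Detour around the light rail work is confusing for cyclists",
    "New vendors at the market near Victoria Park",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(posts)

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

terms = vec.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top_terms = [terms[j] for j in topic.argsort()[-5:]]
    print(f"topic {i}: {top_terms}")
```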

“She’s doing work that’s really focused on mining place-based and participation related information from geosocial media as well as other types of popular media, such as online newspapers and blogs, etc.,” Feick said. “She has developed tools that help to start to identify locales of concern and topics that over space and time vary in terms of their resonance with a community.”

“She’s moving towards the idea of changing public feedback and engagement from something that’s solely episodic and project related to something that could include also this idea of more continuous forms of monitoring,” he added.

To explore the data quality issues associated with VGI use in local governments, they are also working on a new project with Kitchener that will provide pedestrian routing services based on different types of mobility. The light rail project mentioned above has disrupted roadways and sidewalks with construction in the core area and will do so until the project is completed in 2017. Citizen feedback on the impacts of different barriers and temporary walking routes for people with different modes of mobility (e.g. use of wheelchairs, walkers, etc.) will be used to study how to gauge VGI quality and develop best practices for integrating public VGI into government data processes.
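
One way to picture mobility-aware pedestrian routing is as a graph problem: sidewalk segments carrying barriers that a given mobility profile cannot pass are dropped before the shortest path is computed. The sketch below uses a tiny invented sidewalk network and attribute names; it is an assumption-laden illustration, not the project's implementation.

```python
# Hypothetical mobility-aware routing sketch on an invented sidewalk graph.
import networkx as nx

G = nx.Graph()
G.add_edge("A", "B", length=120, barriers=set())
G.add_edge("B", "C", length=80, barriers={"stairs"})
G.add_edge("A", "D", length=200, barriers=set())
G.add_edge("D", "C", length=90, barriers=set())

def route(graph, origin, dest, blocked):
    """Shortest walkable path that avoids edges carrying any blocked barrier."""
    usable = [(u, v) for u, v, d in graph.edges(data=True) if not (d["barriers"] & blocked)]
    return nx.shortest_path(graph.edge_subgraph(usable), origin, dest, weight="length")

print(route(G, "A", "C", blocked=set()))        # unrestricted walker: A-B-C
print(route(G, "A", "C", blocked={"stairs"}))   # cannot use stairs: detours via D
```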

The work of Feick and his students provides important insight for the Geothink partnership on how VGI can be used to improve communication between cities and their citizens. Each of the above projects has improved service for citizens in Kitchener and Waterloo or enhanced the way in which these cities make and communicate decisions. Feick’s past projects and future research directions are similarly oriented toward practical, local applications.

Past Projects and Future Directions

Past projects Feick has completed with students include creation of a solar mapping tool for Toronto that showed homeowners how much money they might make from the provincial feed-in-tariff that pays for rooftop solar energy they provide to the grid. It used a model of solar radiation to determine the payoff from positioning panels on different parts of a homeowner’s roof.
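
The payoff arithmetic behind such a tool is easy to sketch: modelled solar radiation for a roof face, panel area, and system efficiency give an annual energy estimate, and the feed-in-tariff rate converts that to revenue. The figures below are illustrative assumptions, not the tool's actual model or Ontario's tariff rates.

```python
# Back-of-the-envelope feed-in-tariff payoff estimate. All numbers are
# illustrative assumptions.
panel_area_m2 = 20.0                 # usable roof area covered by panels
annual_irradiance_kwh_m2 = 1200.0    # modelled solar radiation for that roof face
system_efficiency = 0.15             # panel and inverter losses combined
tariff_cad_per_kwh = 0.30            # hypothetical feed-in-tariff rate

annual_kwh = panel_area_m2 * annual_irradiance_kwh_m2 * system_efficiency
annual_revenue = annual_kwh * tariff_cad_per_kwh
print(f"~{annual_kwh:.0f} kWh/year, ~${annual_revenue:.0f} CAD/year")
```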

Future research Feick has planned includes work on how to more effectively harness different sources of geosocial media given large data sizes and extraneous comments, further research into disparities in such data between and within cities, and a project with Geothink Co-Applicant Stéphane Roche to present spatial data quality and appropriate uses of open data in easy-to-understand visual formats.

If you have thoughts or questions about this article, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca.

Abstract of Paper mentioned in the above article:

Bumps and bruises in the digital skins of cities: Unevenly distributed user-generated content across U.S. urban areas
Abstract
As momentum and interest builds to leverage new user-generated forms of digital expression with geographical content, classical issues of data quality remain significant research challenges. In this paper we highlight the uneven textures of one form of user-generated data: geotagged photographs in U.S. urban centers as a case study into representativeness. We use generalized linear modeling to associate photograph distribution with underlying socioeconomic descriptors at the city-scale, and examine intra-city variation in relation to income inequality. We conclude with a detailed analysis of Dallas, Seattle, and New Orleans. Our findings add to the growing volume of evidence outlining uneven representativeness in user-generated data, and our approach contributes to the stock of methods available to investigate geographic variations in representativeness. We show that in addition to city-scale variables relating to distribution of user-generated content, variability remains at localized scales that demand an individual and contextual understanding of their form and nature. The findings demonstrate that careful analysis of representativeness at both macro and micro scales simultaneously can provide important insights into the processes giving rise to user-generated datasets and potentially shed light into their embedded biases and suitability as inputs to analysis.

 

Geothink at the 2016 Annual Meeting of the American Association of Geographers

By Drew Bush

From March 29 to April 2, 2016, Geothink’s students, co-applicants, and collaborators presented their research and met with colleagues at the now concluded 2016 Association of American Geographers (AAG) Annual Meeting in San Francisco, CA. Over the week, Geothinkers gave 11 presentations, organized six sessions, chaired five sessions, and were panellists on four sessions. See who attended here.

“This year’s AAG provided a great opportunity to get geographically diverse Geothinkers together,” Victoria Fast, a recently graduated doctoral student in Ryerson University’s Department of Geography and Environmental Studies, wrote in an e-mail to Geothink.ca. “I can’t think of a better place for a meeting about a special journal issue on open data; there are so many fresh, uncensored ideas flying around the conference, both inside and outside of sessions.”

Of particular note for Fast was Panel Session 1475 Gender & GIScience (see her Geothink.ca guest post here). Panelists in the session included Geothink Head Renee Sieber, associate professor in McGill University’s Department of Geography and School of Environment, and Geothink collaborator Sarah Elwood, a professor in the University of Washington’s Department of Geography.

Others agreed.

“A panel on gender and GIScience was refreshing and enlightening,” Geothink Co-Applicant Scott Bell, a professor of Geography and Planning at University of Saskatchewan, wrote to Geothink.ca.

“My presentation was in a day long symposium on human dynamism,” he added. “It summarized a recently published Geothink aligned paper on human mobility tracking and active transportation (published in the International Journal of Geographical Information Science). It seemed to go over pretty well, I’m glad I was in the day-long event as the room was packed most of the day.”

For others, the high cost of the location meant they couldn’t stay for a full week or attend every session. Still, they reported good turnout by members of the Geothink team.

“This year we did not organize a specific panel or panels, or specific sessions to showcase Geothink work,” wrote Geothink Co-Applicant Teresa Scassa, Canada Research Chair in Information Law and professor in the Faculty of Law at the University of Ottawa. “This meant that our presentations were dispersed across a variety of different sessions, on different days of the week.”

Many Geothinkers were also intimately involved in running parts of the conference.

“This was a standout AAG for me,” wrote Geothink researcher Alexander Aylett, a professor and researcher at the Institut national de la recherche scientifique, who ran three sessions (find an overview of Aylett’s sessions at www.smartgreencities.org). “In collaboration with Andrés Luque-Ayala from Durham University, we ran a full day of sessions on the overlap between “Smart” and “Sustainable” cities. We had some excellent presentations—including one from fellow Geothinker Pamela Robinson—and a strong turnout throughout the whole day. (Even at 8 AM, which was a shock to me!)”

For some students, it was the first time they had attended the meeting or presented their own research.

“This was my first time at the AAG,” said Geothink Newsletter Editor Suthee Sangiambut, a master’s student in McGill University’s Department of Geography with Sieber. “I was quite excited to be at the event and was able to meet all kinds of geographers, all of whom had different ideas on what geography exactly is.”

“It was great to see how global events of the past years were shaping our discussions on the Geoweb, privacy, surveillance, national identity, immigration, and more,” he added. “Those at the Disrupt Geo session were able to hear perspectives from private sector and civil society sides, which was quite refreshing and is something I would like to see more of in the future.”

The AAG annual meeting has been held every year since the association’s founding in 1904. This year’s conference included more than 9,000 attendees.

If you have thoughts or questions about this article, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca. We also want to thank Victoria Fast for her willingness to share photos from the 2016 AAG Annual Meeting.

Please find an abstract for the presentation mentioned in this article below.

Leveraging Sensor Networks to Study Human Spatial Behavior

Abstract:
In the past decade society has entered a technological period characterized by mobile and smart computing that supports input and processing from users, services, and numerous sensors. The smartphones that most of us carry in our pockets offer the ability to integrate input from sensors monitoring various external and internal sources (e.g., accelerometer, magnetometer, microphone, GPS, wireless internet, Bluetooth). These relatively raw inputs are processed on the phones to provide us with a seemingly unlimited number of applications. Furthermore, these raw inputs can be integrated and processed in ways that can offer novel representations of human behavior, both disaggregate and aggregate. As a result, new opportunities to examine and better understand human spatial behaviour are available. An application we report here involved monitoring of a group of people over an extended period of time. Monitoring is timed at relatively tightly spaced intervals (every 2 minutes). Such a research setting lends itself to both planned and natural experiments, the latter of which emerge as a result of the regular and ongoing nature of data collection. We will report on both a natural experiment and planned observations resulting from 3 separate implementations of our smartphone based observations. The natural experiment that emerged in the context of our most recent month-long monitoring study of 28 participants using mobile phone-based ubiquitous sensor monitoring will be our focus, but will be contextualized with related patterns from earlier studies. The implications for public health and transportation planning are discussed.
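
As a small illustration of working with the kind of tightly spaced sensor stream the abstract describes, the sketch below resamples a few invented smartphone pings onto a regular two-minute grid per participant. It is a hypothetical example, not the study's actual processing.

```python
# Hypothetical sketch: put irregular smartphone pings onto a 2-minute grid.
# The records below are invented for illustration.
import pandas as pd

pings = pd.DataFrame(
    {
        "participant": ["p01", "p01", "p01"],
        "timestamp": pd.to_datetime(
            ["2016-03-01 08:00:15", "2016-03-01 08:01:40", "2016-03-01 08:03:55"]
        ),
        "lat": [52.131, 52.132, 52.135],
        "lon": [-106.660, -106.658, -106.655],
    }
)

regular = (
    pings.set_index("timestamp")
    .groupby("participant")[["lat", "lon"]]
    .resample("2min")
    .mean()
)
print(regular)
```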

Inside Geothink’s Open Data Standards Project: Standards For Improving City Governance

By Rachel Bloom

Rachel Bloom is a McGill University undergraduate student and project lead for Geothink’s Open Data Standards Project.

In February, I led a Geothink seminar with city officials to introduce the results of our open data standards project, which we began approximately one year earlier. The project was started with the objective of helping municipal publishers of open data standardize their datasets. We presented two spreadsheets: the first was dedicated to evaluating ‘high-value’ open datasets published by Canadian municipalities, and the second consisted of an inventory of open data standards applicable to these types of datasets.

Both spreadsheets enable our partners who publish open data to know what standards exist and who uses them for which datasets. The project I lead is motivated by the idea that well-developed data standards for city governance can grant us the luxury of not having to think about the compatibility of technological components. When we screw in a new light bulb or open a web document we assume that it will work with the components we have (Guidoin and McKinney 2012). Technology, whether it refers to information systems or manufactured goods, relies on standards to ensure its usability and dissemination.

Municipal governments that publish open data recognize the importance of standards for improving the usability of their data. Unfortunately, even though ‘high-value’ datasets have increasingly become available to the public, there is currently no consensus about how these datasets should be structured and specified. Such datasets include crime statistics and annual budget data, which can support new services for citizens once municipalities publish them to their open data catalogues online. Anyone can access such datasets and use the data however they wish without restriction.

Civic data standards provide agreements about semantic and schematic guidelines for structuring and encoding the data. Data standards specify technical data elements such as file formats, data schemas, and unique identifiers to make civic data interoperable. For example, most datasets are published in CSV or XML formats. CSV structures the data in columns and rows, while XML encapsulates the data in a hierarchical tree of <tags>.
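
To make the format difference concrete, the short sketch below writes the same invented budget record as CSV and as XML. The field names are illustrative and are not drawn from any particular municipal schema.

```python
# Illustrative sketch: one invented budget record as CSV and as XML.
import csv
import io
import xml.etree.ElementTree as ET

record = {"department": "Parks", "year": "2016", "amount": "1250000"}

# CSV: one header row of column names, one row of values per record.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=record.keys())
writer.writeheader()
writer.writerow(record)
print(buf.getvalue())

# XML: the same fields nested as tags inside a <budget_line> element.
root = ET.Element("budget_line")
for key, value in record.items():
    ET.SubElement(root, key).text = value
print(ET.tostring(root, encoding="unicode"))
```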

Data standards also specify common vocabularies in order to clarify interpretation of the data’s meanings. Such vocabularies could include, for example, definitions for categories of expenditure in annual budget data. Geothink’s Open Data Standards Project offers publishers of open data an opportunity to improve the usability and efficiency of their data for consumers. This makes it easier to share data across municipalities because the technological components and their meanings within systems will be compatible.

Introducing Geothink’s Open Data Standards Project
No single, clear definition of an open data standard exists. In fact, most definitions of an ‘open data standard’ follow two prevailing ideas: 1) standards for open data, and 2) open standards for data. Geothink’s project examines and relates both of these prevailing ideas (Table 1). The first spreadsheet, the ‘Adoption of Open Data Standards By Cities’, considers open data and its associated data standards. The second spreadsheet, the ‘Inventory of Open Data Standards,’ considers the process of open standardization. In other words, we were curious about what standards are currently being applied to open municipal data, and how to break down and document open standards for data in a way that is useful to municipalities looking to standardize their open data.

Table 1: Differences between ‘open data’ standards and open ‘data standards’

                                         Requires open data    Requires open standard process
Evaluation of ‘High-Value’ Datasets      Yes                   No
Inventory of Open Data Standards         No                    Yes

The project’s evaluation of datasets relates to standards for open data. Standards for open data refer to standards that, regardless of how they are developed and maintained, can be applied to open data. Open data, according to the Open Knowledge Foundation (2014), consists of raw digital data that should be freely available to anyone to use, repurposable and re-publishable as users wish, and free of mechanisms of control like restrictive licenses. However, the process of developing and maintaining standards for open data may not require transparency or include public appeals for its development.

To discover what civic data standards are currently being used, the first spreadsheet, Adoption of Open Data Standards By Cities, evaluates ‘high value’ datasets specific to 10 domains (categories of datasets such as crime, transportation, or service requests) in the open data catalogues for the cities of Vancouver, Toronto, Surrey, Edmonton and Ottawa. The types of data were chosen based on the Open Knowledge Foundation’s choice of datasets considered to provide the greatest utility for the public. The project’s spreadsheet notes the salient structure and vocabulary of each dataset, such as the name, file format, schema, and available metadata. It especially notes which data standards these five municipalities are using for their open data (if any at all).

With consultation from municipal bodies and organizations dedicated to publishing open data, we developed a second spreadsheet, Inventory and Evaluation of Open Data Standards, which catalogues and evaluates 22 open data standards that are available for domain-specific data. The rows of this spreadsheet indicate individual data standards. The columns evaluate background information and each standard’s quality in achieving optimal interoperability. Evaluating the quality of a standard’s performance, such as whether the standard is transferable to multiple jurisdictions, is an important consideration for municipalities looking to optimally standardize their data. Examples of open data standards in this inventory are BLDS for building permit data and the Budget Data Package for annual budget data.

The project’s second spreadsheet is concerned with open standards for data. Open standards, as opposed to closed standards, require a collaborative, transparent, and consensus-driven process to maintain their development (Palfrey and Gasser, 2012). Therefore, open standards honor a commitment to processes of transparency, due process, and rights of appeal. Similarly to open data, open standards resist processes of unchecked, centralized control (Russell, 2014). Open data standards make sure that end users do not get locked into a specific technology. In addition, because open standards are driven by consensus, they are developed according to the needs and interests of participating stakeholders. While we provide spreadsheets on both, our project advocates implementing open standards for open data.

In light of the benefits of open standardization, the metrics of the second spreadsheet note the degree of openness for each standard. Such indicators of openness include multi-stakeholder participation and a consensus-driven process. Openness may be observed through the presence of online forums to discuss suggestions or concerns regarding the standard’s development and background information about each standard’s publishers. In addition, open standards use open licenses that dictate that the standards may be used without restriction and repurposed for any use. Providing this information not only allows potential implementers to be aware of what domain-specific standards exist, but also allows them to gauge how well each standard performs in terms of optimal interoperability and openness.

Finally, an accompanying white paper explains the two spreadsheets and the primary objective of my project for both publishers and consumers of open data. In particular, it explains the methodology, justifies the chosen evaluation criteria, and notes the project’s results. In addition, the paper will aid in navigating and understanding both of the project’s spreadsheets.

Findings from this Project
My work on this project has led me to conclude that the majority of municipally published open datasets surveyed do not use civic data standards. The most common standards used by municipalities in our survey were the General Transit Feed Specification (GTFS) for transit data and the Open311 API for service request data. Because datasets across cities and sectors vary in format and structure, these differences, coupled with a lack of cohesive definitions for labelling, indicate that standardization across cities will be a challenging undertaking. Publishers aiming to extend data sharing among municipalities would benefit from collaborating and agreeing on standards for domain-specific data (as is the case with GTFS).
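
Part of the appeal of a shared specification like GTFS is that, because every transit agency publishes the same stops.txt columns, the same few lines of code can read any city's feed. The file path below is a placeholder assumption.

```python
# Sketch: read the core stops.txt fields from any GTFS feed. The path is a
# placeholder for a downloaded, unzipped feed.
import pandas as pd

stops = pd.read_csv("gtfs_feed/stops.txt")
print(stops[["stop_id", "stop_name", "stop_lat", "stop_lon"]].head())
```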

Our evaluation of 22 domain-specific data standards also shows that standards do exist across a variety of domains. However, some domains, such as budget data, contain more open data standards than others. Therefore, potential implementers must determine which domain-specific standard best fits their objectives in publishing the data and provides the most benefit for the public good.

Many of the standards also provide contact information for their publishers along with online forums for concerns or suggestions. However, many still lack full documentation or are simply in early draft stages. This means that although standards exist, some of them may not yet be ready for implementation.

Future Research Pathways
This project has room for growth so that we can better help our partners who publish and use open data decide how to go about adopting standards. To accomplish this goal, we could add more cities, domains, and open standards to the spreadsheets. In addition, any future changes made to standards or datasets must be reflected in the spreadsheets.

In terms of the inventory of open data standards, it might be beneficial to separate metrics that evaluate the openness of a standard from metrics that evaluate its interoperability. Although we have emphasized the benefits of open standardization in this project, it is evident that some publishers of data do not perceive openness as crucial to a standard’s success in achieving optimal interoperability.

As a result, my project does not aim to dictate how governments implement data standards. Instead, we would like to work with municipalities to understand what is valued within the decision-making process to encourage adoption of specific standards. We hope this will allow us to provide guidance on such policy decisions. Most importantly, to complete such work, we ask Geothink’s municipal partners for input on factors that influence the adoption of a data standard in their own catalogues.

Contact Rachel Bloom at rachel.bloom@mail.mcgill.ca with comments on this article or to provide input on Geothink’s Open Data Standards Project.

References
Guidoin, Stéphane, and James McKinney. 2012. Open Data, Standards and Socrata. November 22, 2012. Available at http://www.opennorth.ca/2012/11/22/open-data-standards.html
Open Knowledge Foundation. 2014. Open Definition 2.0. Opendefinition.org. Retrieved 23 October 2015, from http://opendefinition.org/od/2.0/en/
Palfrey, John Gorham, and Urs Gasser. 2012. Interop: The Promise and Perils of Highly Interconnected Systems. Basic Books.
Russell, Andrew L. 2014. Open Standards and the Digital Age. Cambridge University Press.

Geothink Program Guide for the American Association of Geographers (AAG) 2016 Annual Meeting

By Drew Bush

The Annual Meeting of the American Association of Geographers will be in San Francisco, CA from March 29 to April 2.

A large number of Geothinkers will be presenting at this year’s American Association of Geographers (AAG) Annual Meeting in San Francisco, CA, during the last week of March. You won’t want to miss two of our co-applicants and one of our students making presentations on Tuesday in the 10:00 AM session Data in action: Tracing the open data experiment. Other highlights include Renee Sieber and Sarah Elwood as panellists in Gender & GIScience.

Below we’ve compiled the schedule for all of Geothink’s co-applicants, collaborators and students who will be presenters, panelists, organizers, and chairs during the conference. Find a PDF of this program here. We hope you find this useful for finding the right sessions to join. You can also find the full searchable preliminary AAG program here.

If you’re not able to make the conference, you can follow along on Twitter and use our list of Twitter handles below to join the conversation with our participants.

Join the Conversation on Twitter:
Alex Aylett: @openalex_
Peter Johnson: @peterajohnson
Tenille Brown: @TenilleEBrown
Pamela Robinson: @pjrplan
Jonathan Corbett: @joncorbett
Teresa Scassa: @teresascassa
Sarah Elwood: @SarahElwood1
Renee Sieber: @RE_Sieber
Victoria Fast: @VVFast
Suthee Sangiambut: @notgregorypeck
Sara Harrison: @Sara_Harrison79
Scott Bell: @scottyBgeo
Stéphane Roche: @Geodoc31

And remember to use the conference hashtag #AAG2016 and our hashtag #Geothink or handle @geothinkca when you Tweet.

Come to our Sessions at AAG 2016:

Tuesday, March 29

Wednesday, March 30

Thursday, March 31

Friday, April 1

Saturday, April 2

If you have thoughts or questions about this article, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca.

Image Source: Zheng Zeng, Creative Commons 4.0

Spotlight on Recent Publications: Open Data and Official Language Regimes

The bilingual federal Open Government portal

By Naomi Bloch

Teresa Scassa is a Geothink co-applicant researcher and Canada Research Chair in Information Law at the University of Ottawa. In a recently published paper, Scassa and co-author Niki Singh consider some of the challenges that arise for open data initiatives operating in multilingual regions. The authors use Canada’s federal open data initiative as a case study to examine how a government body in an officially bilingual jurisdiction negotiates its language obligations in the execution of its open data plan.

The article points out two areas for potential concern. First, private sector uses of government data could result in the unofficial outsourcing of services that otherwise would be the responsibility of government agencies, “thus directly or indirectly avoiding obligations to provide these services in both official languages.” Second, the authors suggest that the push to rapidly embrace an open data ethos may result in Canada’s minority language communities being left out of open data development and use opportunities.

According to Statistics Canada’s 2011 figures, approximately 7.3 million people — or 22 percent of the population — reported French as their primary language in Canada. This includes over a million residents outside of Quebec, primarily in Ontario and New Brunswick. Canada’s federal agencies are required to serve the public in both English and French. This obligation is formalized within Canada’s Charter of Rights and Freedoms, as well as the Official Languages Act. Government departments are provided with precise guidelines and frameworks  to ensure that they comply with these regulatory requirements in all of their public dealings and communications.

Scassa and Singh reviewed the various components of the federal open data initiative since the launch of the program to determine how well it is observing bilingual requirements. The authors note that while the open data infrastructure as a whole largely adheres to bilingual standards, one departure is the initiative’s Application Programming Interface (API). An API provides a set of protocols and tools for software developers. In this case, the API supports automated calls for open data housed in government databases. According to the authors, “As this open source software is not developed by the federal government, no bilingualism requirements apply to it.” While professional developers may be accustomed to English software environments even if they are francophones, the authors point out that this factor presents an additional barrier for French-language communities who might wish to use open data as a civic tool.
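
As an illustration of how developers typically interact with such an API, the sketch below queries a CKAN-style "package_search" endpoint of the kind the federal portal is built on. The base URL and query term are assumptions for illustration; consult the portal's own API documentation for the exact endpoint.

```python
# Hypothetical sketch of a CKAN-style open data API call. The base URL is an
# assumption; check the portal's API documentation for the exact endpoint.
import requests

resp = requests.get(
    "https://open.canada.ca/data/api/action/package_search",
    params={"q": "budget", "rows": 5},
    timeout=30,
)
resp.raise_for_status()
for dataset in resp.json()["result"]["results"]:
    print(dataset["title"])
```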

In their analysis of the data portal’s “apps gallery,” Scassa and Singh observed that the majority of apps or data tools posted thus far are provided by government agencies themselves. These offerings are largely bilingual. However, at the time of the authors’ review, only four of the citizen-contributed apps supported French. In general, public contributions to the federal apps gallery are minimal compared to government-produced tools.

As part of their analysis, the authors also looked at the two Canadian Open Data Experience (CODE) hackathon events sponsored by the government in order to promote civic engagement with open data. Communications leading up to the events were provided in English and French. Government documentation also indicated strong participation from Quebec coders at the CODE hackathons, though native language of the coders is not indicated. Interestingly, the authors note, “In spite of the bilingual dimensions of CODE it has produced apps that are for the most part, English only.”

The 2015 event, which was sponsored by government but organized by a private company, had a bilingual website and application process. However, Scassa and Singh found that social media communications surrounding the event itself were primarily in English, including government tweets from the Treasury Board Secretariat. Given this, the authors question whether sufficient effort was made to attract French-Canadian minorities outside of Quebec, and if specific efforts may be needed to gauge and support digital literacy in these minority communities.

While it is still early days for Canada’s open data initiative, this case study serves to highlight the challenges of supporting an open data platform that can meet both legal obligations and broader ethical objectives. The authors conclude that, “In a context where the government is expending resources to encourage the uptake and use of open data in these ways, the allocation of these resources should explicitly identify and address the needs of both official language communities in Canada.”

Abstract

The open data movement is gathering steam globally, and it has the potential to transform relationships between citizens, the private sector and government. To date, little or no attention has been given to the particular challenge of realizing the benefits of open data within an officially bi- or multi-lingual jurisdiction. Using the efforts and obligations of the Canadian federal government as a case study, the authors identify the challenges posed by developing and implementing an open data agenda within an officially bilingual state. Key concerns include (1) whether open data initiatives might be used as a means to outsource some information analysis and information services to an unregulated private sector, thus directly or indirectly avoiding obligations to provide these services in both official languages; and (2) whether the Canadian government’s embrace of the innovation agenda of open data leaves minority language communities underserved and under-included in the development and use of open data.

Reference: Scassa, T., & Singh, Niki. (2015). Open Data and Official Language Regimes: An Examination of the Canadian Experience. Journal of Democracy & Open Government, 7(1), 117–133.

If you have thoughts or questions about this article, get in touch with Naomi Bloch, Geothink’s digital journalist, at naomi.bloch2@gmail.com.

Geothink Student Evan Hamilton Explores Canadian Municipal Open Data and the Role of Journalism

Geothink student Evan Hamilton recently defended his master’s thesis on Toronto data journalists’ use of open data.

By Naomi Bloch

Data journalists are some of the most active users of government open data in Canada. In his recently defended thesis, Evan Hamilton, a master’s student in the University of Toronto’s Faculty of Information, examined the role of data journalists as advocates, users, and producers of open data.

Hamilton’s thesis, titled “Open for reporting: An exploration of open data and journalism in Canada,” addressed four research questions:

  1. Are open data programs in Ontario municipalities developing in a way that encourages effective business and community development opportunities?
  2. How and why do journalists integrate open data in reporting?
  3. What are the major challenges journalists encounter in gaining access to government data at the municipal level?
  4. How does journalism shape the open data development at both the policy level and the grassroots level within a municipality?

To inform his work, Hamilton conducted in-depth, semi-structured interviews with three key data journalists in the City of Toronto: Joel Eastwood at the Toronto Star, William Wolfe-Wylie at the CBC, and Patrick Cain at Global News. While open data is often touted as a powerful tool for fostering openness and transparency, in his paper Hamilton notes that there is always the risk that “the rhetoric around open data can also be employed to claim progress in public access, when in fact government-held information is becoming less accessible.”

In an interview with Geothink, Hamilton explained that the journalists made important distinctions between the information currently available on Canadian open data portals and the information they typically seek in order to develop compelling, public-interest news stories. “One of the big things I took away from my interviews was the differentiation that journalists made between Freedom of Information and open data,” said Hamilton. “They were using them for two completely different reasons. Ideally, they would love to have all that information available on open data portals, but the reality is that the portals are just not as robust as they could be right now. And a lot of that information does exist, but unfortunately journalists have to use Freedom of Information requests to get it, which is a process that can take a lot of time and not always lead to the best end result.”

Legal provisions at various levels of government allow Canadians to make special Freedom of Information requests to try to access public information that is not readily available by other means. A nominal fee is usually charged. In Toronto, government agencies generally need to respond to such requests within 30 days. Even so, government responses do not always result in the provision of usable data, and if journalists request large quantities of information, departments have the right to extend the 30-day response time. For journalists, a delay of even a few days can kill a story.

While the journalists Hamilton interviewed recognized that open data portals were limited by a lack of resources, there was also a prevailing opinion that many government agencies still prefer to vet and protect the most socially relevant data. “Some were very skeptical of the political decisions being made,” Hamilton said. “Like government departments are intentionally trying to prevent access to data on community organizations or data from police departments looking at crime statistics in specific areas, and so they’re not providing it because it’s a political agenda.”

Data that helps communities

In his thesis, Hamilton states that further research is needed to better understand the motivations behind government behaviours. A more nuanced explanation involves the differing cultures within specific municipal institutions. “The ones that you would expect to do well, do do well, like the City of Toronto’s Planning and Finance departments,” Hamilton said. “Both of them provide really fantastic data that’s really up-to-date, really useful and accessible. They have people you can talk to if you have questions about the data. So those departments have done a fantastic job. It’s just having all the other departments catch up has been a larger issue.”

An issue of less concern to the journalists Hamilton consulted is privacy. The City’s open data policy stresses a balance between appropriate privacy protection mechanisms and the timely release of information of public value. Hamilton noted that in Toronto, the type of information currently shared as open data poses little risk to individuals’ privacy. At the same time, the journalists he spoke with tended to view potentially high-risk information such as crime data as information for which public interest should outweigh privacy concerns.

Two of the three journalists stressed the potential for data-driven news stories to help readers better understand and address needs in their local communities. According to Hamilton’s thesis, “a significant factor that prevents this from happening at a robust level is the lack of data about marginalized communities within the City.”

The journalists’ on-the-ground perspective echoes the scholarly literature, Hamilton found. If diverse community voices are not involved in the development of open data policies and objectives, chances for government efforts to meet community needs are hampered. Because of their relative power, journalists do recognize themselves as representing community interests. “In terms of advocacy, the journalists identify themselves as open data advocates just because they have been the ones pushing the city for the release of data, trying to get things in a usable format, and creating standard processes,” Hamilton said. “They feel they have that kind of leverage, and they act as an intermediary between a lot of groups that don’t have the ability to get to the table during negotiations and policy development. So they’re advocating for their own interests, but as they fulfill that role they’re advocating for marginalized communities, local interest groups, and people who can’t get to the table.”

Policy recommendations

Hamilton’s research also pointed to ways in which data journalists can improve their own professional practices when creating and using open data. “There needs to be more of a conversation between journalists about what data journalism is and how you can use open data,” Hamilton said. “When I talked to them, there was not a thing like, ‘Any time you use a data set in your story you cite the data set or you provide a link to it.’ There’s no standard practice for that in the industry, which is problematic, because then they’re pulling numbers out of nowhere and they’re trusting that you’ll believe it. If you’re quoting from a data set you have to show exactly where you’re getting that information, just like you wouldn’t anonymize a source needlessly.”

While Hamilton concentrated on building a picture of journalists’ open data use in the City of Toronto, his findings resulted in several policy recommendations for government agencies more broadly. First, Hamilton stressed that “as a significant user group, journalists need to be consulted in a formal setting so that open data platforms can be better designed to target their specific needs.” This is necessary, according to Hamilton, in order to permit journalists to more effectively advocate on behalf of their local communities and those who may not have a voice.

Another recommendation is aimed at meeting the needs of open data users who have different levels of competency. Although he recognizes the challenges involved, in his concluding chapter Hamilton writes, “Municipal governments need to allocate more resources to open data programs if they are going to be able to fulfill the needs of both a developer class requiring technical specifications, and a general consumer class that requires tools (for example, visualizations and interactives) to consume the data.”

Finally, Hamilton recommends that municipalities engage in more formal efforts “to combat internal culture in municipal departments that are against publishing public information. Data should be viewed as a public service, and public data should be used in the public interest.”

If you have any questions for Evan, reach him on Twitter here: @evanhams


Evan Hamilton successfully defended his Master of Information thesis on September 29 at the Faculty of Information, University of Toronto. His work was supervised by Geothink co-applicant researcher Leslie Regan Shade, associate professor in the University of Toronto’s Faculty of Information. Other committee members included University of Toronto’s Brett Caraway and Alan Galey (chair), as well as April Lindgren, an associate professor at Ryerson University’s School of Journalism and founding director of the Ryerson Journalism Research Centre, a Geothink partner organization.

 Abstract

This thesis describes how open data and journalism have intersected within the Canadian context in a push for openness and transparency in government collected and produced data. Through a series of semi-structured interviews with Toronto-based data journalists, this thesis investigates how journalists use open data within the news production process, view themselves as open data advocates within the larger open data movement, and use data-driven journalism in an attempt to increase digital literacy and civic engagement within local communities. It will evaluate the challenges that journalists face in gathering government data through open data programs, and highlight the potential social and political pitfalls for the open data movement within Canada. The thesis concludes with policy recommendations to increase access to government held information and to promote the role of data journalism in a civic building capacity.

Reference: Hamilton, Evan. (2015). Open for reporting: An exploration of open data and journalism in Canada (MI thesis). University of Toronto.

If you have thoughts or questions about this article, get in touch with Naomi Bloch, Geothink’s digital journalist, at naomi.bloch2@gmail.com.

Spotlight on Recent Publications: Critical Reflections on Outcomes from Three Geoweb Partnerships

By Naomi Bloch

Exploring university–community partnerships

Participatory geospatial technologies have the potential to support and promote citizen engagement. This great promise has led to more collaborations between academics and community partners interested in pursuing this aim. In their recently published paper, “A web of expectations: Evolving relationships in community participatory geoweb projects,” four Geothink researchers and their colleagues cast a reflective eye on the participatory action research processes behind three completed geoweb partnership projects.

Co-author Jon Corbett, an associate professor in Community, Culture and Global Studies at the University of British Columbia’s Okanagan campus, sees their ACME journal article as helping to fill a gap in the geoweb literature. “For me, one of the things I’m most interested in is how—in a truthful and well-positioned way—we can talk about the veracity of the work that we’ve done in regards to its ability to actually bring about impact and social change,” Corbett said.

In the article, the authors compare the different cases in order to consider some of the tangible, empirical challenges that the projects encountered, concentrating on the frictions that can occur where technical and social considerations intersect.

Central Okanagan Community Food Map interface

Participatory geoweb initiatives commonly rely on out-of-the-box mapping tools. For these three projects, a central aim was to employ the expertise of the university researchers to co-develop and co-evaluate custom geospatial web tools that could address community partners’ objectives. Ideally, such collaborations can benefit all parties. Researchers can learn about the potential and the limitations of the geoweb as a tool for civic engagement while partners have the opportunity to reflect on their objectives and access a wider tool set for accomplishing them. In reality, collaborations require compromises and negotiations. The question then becomes: when are researchers’ academic objectives and partners’ community objectives truly complementary?

In the first case study, the geoweb was used to create a participatory business promotion website for a rural Quebec community, intended as one component of a larger regional economic development strategy. The second case was a collaboration between two university partners and a cultural heritage organization in Ontario. The partners hoped the customized online tool could “serve as a ‘living’ repository of cultural heritage information that was both accessible to the public and could facilitate the contribution of knowledge from the public.” In the third project, university researchers worked with government and grassroots organizations at local as well as provincial levels. The vision in this case was to enable non-expert community members in the Okanagan region to share their own knowledge and experiences about local food and its availability.

Corbett explained that in reflecting on their work, the researchers realized that as social scientists with very specific domains of expertise in political science, geographic information systems, and community research, “the types of skills we needed to negotiate the relationships were far different from the sorts of traditional disciplinary fields that we work in.”  Their collaborators tended to identify the academics more as technical consultants than scholars. As the authors write, “most academics remain untrained in software development, design, marketing, long-term application management and updating, legal related issues, [and] terms of service.”

Although the three collaborations were quite different in terms of the publics involved as well as the negotiated objectives of the projects and the tools employed to achieve them, the authors identified several key common themes. The authors note, “In all three case studies, we found that the process of technology development had substantial influence on the relationship between university developers and community organization partners. This influence was seen in the initial expectations of community partners, differential in power between researcher and community, sustainability of tools and collaborations, and the change from research collaboration towards ‘deal making.'”

In the end, Corbett said, “All of the projects were extremely precarious in how we could assign value or success to them. The paper was really an academic reflection on the outcomes of those three different projects.”

Abstract

New forms of participatory online geospatial technology have the potential to support citizen engagement in governance and community development. The mechanisms of this contribution have predominantly been cast in the literature as ‘citizens as sensors’, with individuals acting as a distributed network, feeding academics or government with data. To counter this dominant perspective, we describe our shared experiences with the development of three community-based Geospatial Web 2.0 (Geoweb) projects, where community organizations were engaged as partners, with the general aim to bring about social change in their communities through technology development and implementation. Developing Geoweb tools with community organizations was a process that saw significant evolution of project expectations and relationships. As Geoweb tool development encountered the realities of technological development and implementation in a community context, this served to reduce organizational enthusiasm and support for projects as a whole. We question the power dynamics at play between university researchers and organizations, including project financing, both during development and in the long term. How researchers managed, or perpetuated, many of the popular myths of the Geoweb, namely that it is inexpensive and easy to use (though not to build, perhaps) impacted the success of each project and the sustainability of relationships between researcher and organization. Ultimately, this research shows the continuing gap between the promise of online geospatial technology, and the realities of its implementation at the community level.

Reference: Johnson, Peter A, Jon Corbett, Christopher Gore, Pamela J Robinson, Patrick Allen, and Renee E Sieber. A web of expectations: Evolving relationships in community participatory geoweb projects. ACME: An International E-Journal for Critical Geographies, 2015, 14(3), 827-848.

If you have thoughts or questions about this article, get in touch with Naomi Bloch, Geothink’s digital journalist, at naomi.bloch2@gmail.com.