Category Archives: Academic Output

Crosspost: Being Philosophical About Crowdsourced Geographic Information

This Geo: Geography and Environment blog post is cross-posted with permission from the authors, Renée Sieber (McGill University, Canada) and Muki Haklay (University College London, UK).
By Renée Sieber and Muki Haklay

Our recent paper, The epistemology(s) of volunteered geographic information: a critique, started from a discussion we had about changes within the geographic information science (GIScience) research communities over the past two decades. We’ve both worked in participatory geographic information systems (GIS) and critical studies of GIScience since the late 1990s, engaging people from all walks of life with the information available in GIS. Many times we worked together with people to create new geographic information and maps. Our goal was to help reflect their point of view of the world and their knowledge of local conditions, not to aim for universal rules and principles. For example, the image below is from a discussion with the community in Hackney Wick, London, where individuals collaborated to ensure that the information captured represented their views on the area and its future, in light of the Olympic works happening on their doorstep. The GIScience research community, by contrast, emphasizes quantitative modelling and universal rules about geographic information (exemplified by frequent mentions of Tobler’s first law of geography). That community was not especially welcoming of qualitative, participatory mapping efforts, leaving them mostly at the margins of the discipline.


Participatory Mapping in Hackney Wick, London, 2007

Around 2005, researchers in GIScience started to notice that when people used their Global Positioning System (GPS) devices to record where they took pictures, or used online mapping apps to make their own maps, they were generating a new kind of geographic information. Once OpenStreetMap and other sources of user-generated geographic information came on the scene, the early hostility evaporated and volunteered geographic information (VGI), or crowdsourced geographic information, was embraced as a valid, valuable and useful source of information for GIScience research. More importantly, VGI became an acceptable research subject, raising questions such as how to assess its quality and what motivates people to contribute.

This about-face was puzzling, and we felt it justified an investigation of the concepts and ideas that allowed it to happen. Why did VGI become part of the “truth” in GIScience? In philosophical language, the questions ‘Where does knowledge come from? How was it created? What is the meaning and truth of knowledge?’ belong to epistemology, and our paper evolved into an exploration of the epistemology, or more accurately the multiple epistemologies, inherent in VGI. It is easy to make the case that VGI is a new way of knowing the world, given (1) its potential to disrupt existing practices (e.g., the way OpenStreetMap provides an alternative to official maps, as shown in the image below) and (2) the way VGI both constrains contributions (e.g., to 140 characters) and opens them up (e.g., through easy user interfaces and multimedia offerings). VGI affords a new epistemology, a new way of knowing geography, of knowing place. Rather than observing a way of knowing, we were interested in what researchers thought the epistemology of VGI was. They were building it in real time and attempting to ensure it conformed to existing ways of knowing. An analogy: instead of knowing a religion from the inside, you construct your conception of it, with your own assumptions and biases, from the outside. We argue that this kind of construction was occurring with VGI.

OpenStreetMap mapping party (Nono Fotos)


We likewise were interested in how long-standing critics of mapping technologies would respond to new sources of data and new platforms for those data. Criticism tends to be grounded in the structuralist work of Michel Foucault on power and how it is shaped by wider societal structures. Critics extended traditional notions of volunteerism and empowerment to VGI without necessarily examining whether these were applicable to the new ‘ecosystem’ of geospatial apps, companies, code and data. We were also curious why the critiques focussed on the software platforms used to generate the data (e.g., Twitter) instead of the data themselves (tweets). It was as if the platforms used to create and share VGI were embedded in various socio-political and economic configurations, while the data remained innocent of any association with those assemblages. Lastly, we saw an unconscious shift in the Critical GIS/GIScience field from the collective to the personal. Historically, in the wider field of human geography, when we thought of civil society mapping together using technology, we looked at collective activities like counter-mapping (e.g., a community fighting an airport runway extension by conducting a spatial analysis to demonstrate the adverse impacts of noise or pollution on the surrounding geography). We believe the shift occurred because Critical GIS scholars were never comfortable with community and consensus-based action in the first place. In hindsight, it probably is easier to critique the (individual) emancipatory potential than the (collective) empowerment potential of the technology. Moreover, Critical GIS researchers have shifted their attention away from geographic information systems towards the software stack of geospatial software and geosocial media, which raises questions about what is covered by this term.
For all of these reasons and more we decided to investigate the “world building” from both the instrumentalist scientists and from their critics.

We do use some philosophical framing—Borgmann has a great idea called the device paradigm—to analyse what is happening, and we hope that the paper will contribute to the debate in the critical studies of geographical information beyond the confines of GIScience to human geography more broadly.

About the authors: Renée E. Sieber is an Associate Professor in the Department of Geography and the School of Environment at McGill University. Muki Haklay is Professor of Geographical Information Science in the Department of Civil, Environmental and Geomatic Engineering at University College London.

Crosspost: How is your Toronto neighbourhood portrayed in the news? Check it out using these interactive maps

This post is cross-posted with permission from April Lindgren and Christina Wong at Local News Research Project. 

By April Lindgren and Christina Wong

Introduction
Concerns about how neighbourhoods are portrayed in the news have surfaced regularly in the Toronto area over the years. But are those concerns valid?

Interactive maps produced by the Local News Research Project (LNRP) at Ryerson University’s School of Journalism are designed to help Toronto residents answer this question. The maps give the public access to data the research project collected on local news coverage by the Toronto Star and the online news website OpenFile.ca. Members of the public and researchers can use the maps to:

  • get an overall sense of where news in the city is – and isn’t – covered
  • compare patterns of local news coverage by two different news organizations
  • examine the city-wide geographic patterns of reporting on crime, entertainment and other major news topics
  • examine news coverage in each of Toronto’s 44 wards, including how often news stories and photographs reference locations in a ward
  • see what story topics are covered in each ward

The maps are based on the Toronto Star’s local news coverage published on 21 days between January and August 2011. Researchers have found that a two-week sample of news is generally representative of news coverage over the course of a year (Riffe, Aust & Lacy, 1993). The data for OpenFile.ca, which suspended publishing in 2012, were collected for every day between January and August 2011.
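Constructed-week sampling, the approach evaluated by Riffe, Aust and Lacy, balances day-of-week effects by drawing each weekday separately at random from the study period. A minimal sketch (function and parameter names are our own invention, and this sketch draws two constructed weeks of 7 days each rather than the study’s 21 days):

```python
import random
from datetime import date, timedelta

def constructed_weeks(start, end, n_weeks, seed=0):
    """Draw n_weeks dates for each day of the week, at random,
    from the period [start, end] inclusive."""
    rng = random.Random(seed)
    days = [start + timedelta(d) for d in range((end - start).days + 1)]
    by_weekday = {}
    for d in days:
        by_weekday.setdefault(d.weekday(), []).append(d)
    sample = []
    for weekday in range(7):  # Monday..Sunday
        sample.extend(rng.sample(by_weekday[weekday], n_weeks))
    return sorted(sample)

# Two constructed weeks (14 days) drawn from the January-August 2011 period.
sample = constructed_weeks(date(2011, 1, 1), date(2011, 8, 31), 2)
print(len(sample))  # 14
```

Because every weekday is represented equally, a small sample avoids over-weighting, say, slow weekend news days.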

Click here to see the maps or continue reading to find out more about news coverage and neighbourhood stereotyping, how the maps work, and the role of open data sources in this project.

 

Local news and neighbourhood stereotyping
The decision to explore news coverage of Toronto neighbourhoods was prompted by concerns expressed by citizens and local politicians about how certain parts of the city are portrayed in the local media. Residents were furious (Pellettier, Brawley & Yuen, 2013), for instance, when Toronto Star columnist Rosie DiManno referred to the city’s Scarborough area as “Scarberia” in an article about former mayor Rob Ford’s re-election campaign (DiManno, 2013). Back in 2007, then-mayor David Miller went so far as to contact all of the city’s news media asking them to cite the nearest main intersection rather than reporting more generally that a particular crime occurred in Scarborough (Maloney, 2007). In Toronto’s west end, the local city councillor suggested negative connotations associated with the Jane and Finch neighbourhood could be defused by renaming it University Heights, but the idea was vehemently rejected by residents (Aveling, 2009).

A study that investigated how Toronto’s most disadvantaged neighbourhoods were covered by the Toronto Star concluded that there was very little coverage of news in these communities (Lindgren, 2009). The study, which examined Toronto Star local news reporting in 2008, also found that crime tended to dominate the limited coverage that did take place and suggested the problem could be rectified not by ignoring crime stories, but by increasing coverage of other sorts of issues in those communities.

 

Exploring the maps
The interactive maps allow users to explore local news coverage in the City of Toronto. A sample of local stories and photographs from the Toronto Star (the local newspaper with the largest circulation in the city) and OpenFile.ca (a community-based news website) were identified and analyzed in 2011 to capture data about story topics and mentions of geographic locations.

These maps make the data available to the public in a way that allows users to explore and compare media coverage in different areas of the city. Users can zoom in on a neighbourhood and discover all of the locations referenced within it. Each point on the map represents a location that was referenced in one or more news items. Users can click on any of these points to see a list of news articles associated with that location (Figure 1).
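Conceptually, each point on the map is just a location keyed to the list of news items that referenced it. A minimal sketch as a GeoJSON-style feature (the field names and sample articles here are invented for illustration, not the LNRP’s actual schema):

```python
# One map point: a referenced location plus the news items citing it.
# Field names and articles are illustrative, not the project's real schema.
point = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [-79.3832, 43.6532]},  # lon, lat
    "properties": {
        "location_name": "Yonge-Dundas Square",
        "articles": [
            {"headline": "Transit plan unveiled", "topic": "transit"},
            {"headline": "Weekend festival draws crowds", "topic": "entertainment"},
        ],
    },
}

# "Clicking" a point amounts to reading its article list:
for a in point["properties"]["articles"]:
    print(a["topic"], "-", a["headline"])
```

Storing the article list on the feature itself is what lets the map answer "what was written about this spot?" without a round trip to a database.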

Figure 1. Users can click each point to find out about the news articles that referenced the location

By clicking within a ward boundary, users can also access a summary chart describing the breakdown by subject of all local news coverage in that ward. Users interested in the Scarborough area, for instance, can zoom into that area on the map and click on each Scarborough ward to see what sorts of stories (crime, transit, entertainment, sports, etc.) were reported on in that ward (Figure 2).

Figure 2. Users can click within a ward to access charts summarizing news coverage by topic

Users interested in how and where a particular news topic is covered can access separate interactive maps for the top five subjects covered by the two news sources. Figure 3, for example, shows all locations mentioned in crime and policing stories published by the Toronto Star during the study’s sample period.

Figure 3. Toronto Star coverage of crime and policing news

The role of open data sources in creating these maps
A total of 23 pre-existing datasets were used to support the creation of these interactive maps, including relevant open datasets that were publicly available online in 2008. The datasets were used to populate a list of geographic locations in the GTA that had the potential to be referenced in local news stories. Each dataset was assigned a unique numerical code, and all 23 datasets were appended to a geographic reference table that coders could search. This built-in reference list of geographic locations and features allowed for a more accurate and efficient coding process: coders entering information about spatial references in local news items could select many of the referenced geographic locations from the pre-populated list rather than entering the information manually, which helped prevent human error and also sped up the coding process.
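The reference-table workflow described above can be sketched roughly as follows; the dataset contents and the code format are invented for illustration:

```python
# Build a searchable reference table from several source datasets,
# giving each dataset a unique numeric code so every location gets
# a unique identifier. (Dataset names and code format are illustrative.)
datasets = {
    1: ["Union Station", "Pearson Airport"],     # e.g. a transit-hubs dataset
    2: ["High Park", "Trinity Bellwoods Park"],  # e.g. a city-parks dataset
}

reference_table = {}
for ds_code, places in datasets.items():
    for i, name in enumerate(places, start=1):
        reference_table[f"{ds_code:02d}-{i:04d}"] = name

def search(term):
    """Coders look locations up here instead of typing them manually."""
    term = term.lower()
    return [(code, name) for code, name in reference_table.items()
            if term in name.lower()]

print(search("park"))  # both park entries, by their unique codes
```

Selecting from a pre-populated list like this is what eliminates misspelled or inconsistently named locations during coding.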

We would have preferred to use more open data sources during the initial development of the database, but this wasn’t possible due to limited availability of datasets with the spatial attributes that make mapping possible. At that time, only two of the 23 datasets used (approximately 8.7% of the total) were available from open data sources in a format that included geography (such as shapefiles). Both files were obtained from the City of Toronto’s Open Data website. These limitations meant that the majority of the database relied on contributions from private data sources.

The situation has improved over time as more open government data become available in geographic file formats that support research with spatial analysis. As of mid-2015, six more of the 23 datasets (two federal, one provincial and three municipal) used in the database have become available. If we were creating the database today, a total of eight datasets or 34.8% of the initial database could be populated using open data sources (Table 1).

Table 1. Availability of open data sources
                         Available in 2008 when       Currently
                         the database was created     available
  Private sources        21                           15
  Government open data   2 (8.7% of database)         8 (34.8% of database)
  Total # of datasets    23                           23
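The shares in Table 1 are simple ratios against the 23-dataset total:

```python
# Recompute the Table 1 percentages: open datasets as a share of all 23.
total = 23
shares = {year: round(n / total * 100, 1)
          for year, n in {"2008": 2, "mid-2015": 8}.items()}
print(shares)  # {'2008': 8.7, 'mid-2015': 34.8}
```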

 

Since 2008, the Government of Canada has launched its own open data portal, joined the Open Government Partnership alongside other countries supporting the release of open government data, and adopted the G8 Open Data Charter (Standing Committee on Government Operations and Estimates, 2014). Provincial and municipal governments have made similar improvements to open data access. The Government of Ontario launched an online open data catalogue in 2012 and is currently developing an Open Data Directive to be implemented later this year (Fraser, 2015). The City of Toronto introduced its open data portal in 2009 and developed an Open Data Policy in 2012 (City of Toronto, n.d.).

As Table 1 suggests, however, further improvements are required to reduce barriers to research and innovation. A report from the Standing Committee on Government Operations and Estimates, for instance, recommended that the federal government provide data at smaller levels of geography, work together with different levels of government to establish standards and release data, and provide a greater variety of open data to reflect all government departments. The report noted that the release of open data can improve government efficiency, foster citizen engagement, and encourage innovation (Standing Committee on Government Operations and Estimates, 2014). Academic researchers have argued that improvements in the availability of open government data would stimulate valuable research and outcomes with economic and social value (Jetzek, Avital & Bjorn-Andersen, 2014; Kucera, 2015; Zuiderwijk, Janssen & Davis, 2014). Journalists are also pushing for easier and greater access to data (Schoenhoff & Tribe, 2014).

 

Conclusion
Research conducted by the Local News Research Project was made possible by public funds and as such the data should be widely available. The interactive maps are an attempt to fulfill that obligation.

While the maps capture only a snapshot of news coverage at a fixed point in time, they nonetheless demonstrate the importance of geospatial analysis in local news research (Lindgren & Wong, 2012). They are also a powerful data visualization tool that allows members of the public to independently explore media portrayals of neighbourhoods and the extent to which some parts of a city are represented in the news while others are largely ignored.

Finally, this mapping project also illustrates how open government data can foster research and how much there is still to do in terms of making data available to the public in useful formats.

 

The Local News Research Project was established in 2007 to explore the role of local news in communities. Funding for this research has been provided by Ryerson University, CERIS-The Ontario Metropolis Centre and the Social Sciences and Humanities Research Council.

About the authors: April Lindgren is an Associate Professor in Ryerson University’s School of Journalism and Academic Director of the Ryerson Journalism Research Centre. Christina Wong is a graduate of Ryerson University’s Geographic Analysis program. Initial work on the maps was done in 2014 by GEO873 students Cory Gasporatto, Lorenzo Haza, Eaton Howitt and Kevin Wink from Ryerson University’s Geographic Analysis program.

 

References

Aveling, N. (2009, January 8). Area now being called University Heights, but some call change a rejection of how far we’ve come. Toronto Star, p. A10.

City of Toronto. (n.d.). Open Data Policy. Retrieved from http://www1.toronto.ca/wps/portal/contentonly?vgnextoid=7e27e03bb8d1e310VgnVCM10000071d60f89RCRD

DiManno, R. (2013, July 6). Ford fest makes a strategic move. Toronto Star, p. A2.

Fraser, D. (2015, May 1). Ontario announces more open data, public input. St. Catharines Standard. Retrieved from http://www.stcatharinesstandard.ca/2015/05/01/ontario-announces-more-open-data-public-input

Jetzek, T., Avital, M. & Bjorn-Andersen, N. (2014). Data-driven innovation through open government data. Journal of Theoretical and Applied Electronic Commerce Research, 9(2), 100-120.

Kucera, J. (2015). Open government data publication methodology. Journal of Systems Integration, 6(2), 52-61.

Lindgren, A. (2009). News, geography and disadvantage: Mapping newspaper coverage of high-needs neighbourhoods in Toronto, Canada. Canadian Journal of Urban Research, 18(1), 74-97.

Lindgren, A. & Wong, C. (2012). Want to understand local news? Make a map. 2012 Journalism Interest Group proceedings. Paper presented at Congress 2012 of the Humanities and Social Sciences conference. Retrieved from http://cca.kingsjournalism.com/?p=169

Maloney, P. (2007, January 16). Mayor sticks up for Scarborough. Toronto Star. Retrieved from http://www.thestar.com/news/2007/01/16/mayor_sticks_up_for_scarborough.html?referrer=

Pellettier, A., Brawley, D. & Yuen, S. (2013, July 11). Don’t call us Scarberia [Letter to the editor]. Toronto Star. Retrieved from http://www.thestar.com/opinion/letters_to_the_editors/2013/07/11/dont_call_us_scarberia.html

Riffe, D., Aust, C. F. & Lacy, S. R. (1993). The effectiveness of random, consecutive day and constructed week sampling. Journalism Quarterly, 70, 133-139.

Schoenhoff, S. & Tribe, L. (2014). Canada continues to struggle in Newspapers Canada’s annual FOI audit [web log post]. Retrieved from https://cjfe.org/blog/canada-continues-struggle-newspapers-canada%E2%80%99s-annual-foi-audit

Standing Committee on Government Operations and Estimates. (2014). Open data: The way of the future: Report of the Standing Committee on Government Operations and Estimates. Retrieved from http://www.parl.gc.ca/content/hoc/Committee/412/OGGO/Reports/RP6670517/oggorp05/oggorp05-e.pdf

Zuiderwijk, A., Janssen, M. & Davis, C. (2014). Innovation with open data: Essential elements of open data ecosystems. Information Polity, 19(1, 2), 17-33.

Call for Papers (Book): Geoweb Policy, Law, and Ethics


Hello everyone,

We are putting together a draft prospectus for consideration by the University of Ottawa Press for their Law, Technology and Media book series. The edited volume will focus on the legal, policy, regulatory, and ethical issues arising from the geoweb. Anticipated issues that the volume will cover include privacy, surveillance, IP, licensing, open data, the public/private divide, citizen engagement, and governance.

We seek brief expressions of interest for chapters. Please send to both Elizabeth and Leslie, by August 30, a chapter title and a short (150-200 word) abstract for our consideration.

Thank you,
Leslie and Elizabeth

Crosspost: Green Cities and Smart Cities: The potential and pitfalls of digitally-enabled green urbanism


The Vancouver Convention Centre in Vancouver, BC, Canada was the world’s first LEED Platinum-certified convention center. It also has one of the largest green roofs in Canada. Image Credit: androver / Shutterstock.com

This post is cross-posted with permission from Alexander Aylett, from UGEC Viewpoints. Aylett is an Assistant Professor at the Centre on Urbanisation, Culture and Society at the National Institute for Scientific Research (UCS-INRS) in Montreal, Quebec.

By Alexander Aylett

Since its early days, the discourse around “smart cities” has included environmental sustainability as one of its core principles. The application of new digital technologies to urban spaces and processes is celebrated for its ability to increase the well-being of citizens while reducing their environmental impacts. But this engagement with sustainability has been limited to a technocratic focus on energy systems, building efficiency, and transportation. It has also privileged top-down interventions by local government actors. For all its novelty, the smart cities discussion is operating with a vision of urban sustainability that dates from the 1990s, and an approach to planning from the 1950s.

This definition of “urban sustainability” overlooks key facets of a city’s ecological footprint (such as food systems, resource consumption, production-related greenhouse gas emissions, air quality, and the urban heat island effect). It also ignores the ability of non-state actors to contribute meaningfully to the design and implementation of urban policies and programs. But that need not be the case. In fact, if employed properly, new information technologies seem like ideal tools for addressing some of urban sustainability’s most persistent challenges.

Progress and Lasting Challenges in Local Climate Governance

Let’s take a step back. Discussions of smart cities often begin with an account of the capabilities of specific technologies or interfaces and then imagine urbanism – and urban sustainability – through the lens of those technologies. I’d like to do the opposite: begin with the successes and lasting challenges of urban sustainability and interpret the technologies from within that context. To understand the role that “smart” technologies could play in enabling sustainable cities, it’s useful to first look at what we have managed to accomplish so far, and what still needs to be done.

For those of us working on sustainable cities and urban responses to climate change, the past two decades have been a period of both amazing successes and enduring challenges. In the early 1990s a handful of cities began promoting the (at that time) counterintuitive idea that local governments had a key role to play in addressing global climate change. Since then, the green cities movement has won significant discursive, political, and technical battles.

Global inter-municipal organizations like ICLEI or the C40 now have memberships that represent thousands of cities. Two decades of work have created planning standards and tools and an impressive body of “best practice” literature. Through the sustained efforts of groups like ICLEI, cities are now recognized as official governmental stakeholders in the international climate change negotiations coordinated by the United Nations.

But – crucially – real urban emissions reductions are lagging well below what is needed to help keep global CO2 within safe limits. Looking at the efforts of individual cities and the results of a global Urban Climate Change Governance survey that I conducted while at MIT (Aylett 2014, www.urbanclimatesurvey.com) shows why. Apart from a small contingent of charismatic cities like Vancouver, Portland, or Copenhagen, cities are struggling to move beyond addressing the “low-hanging fruit” of emissions from municipal facilities (i.e., vehicle fleets, municipal buildings, street lighting – known as “corporate emissions”) to taking action on the much more significant emissions generated by the broader urban community (i.e., business, industry, transportation, and residential emissions).

This problem has been with us since the early days of urban climate change responses. But how we understand it has changed significantly. Where some cities used to inventory only their corporate emissions, this is now rare. Current guidelines cover community-wide emissions and work is underway to create a global standard for emissions inventories that will also engage with emissions produced in the manufacture of the goods and services consumed within cities (see Hoornweg et al. 2011).

Building on the increased scope of our technical understanding of urban emissions is a change in how we understand the work of governing climate change at the local level. A top-down vision of climate action focused on the regulatory powers of isolated local government agencies is being replaced by one that is horizontal, relational, and collaborative. This approach transforms relationships both inside and outside of local governments, linking together traditionally siloized municipal agencies and forging partnerships with civil-society and business actors (Aylett 2015).

The increased prominence of non-state actors in urban climate change governance has led to growing calls for partnerships across the public-private divide (Osofsky et al. 2007; Andonova 2010; Bontenbal and Van Lindert 2008). These partnerships play an important role in overcoming gaps in capacity, translating the climate change impacts and response options into language that is meaningful to different groups and individuals, and accelerating the development of solutions. Follow-up analysis of the 2014 MIT-ICLEI Climate survey shows that these partnerships have an important positive impact on the scope of concrete emissions reductions. Cities with stronger partnerships appear to be more able to create concrete emissions reductions outside of areas directly controlled by the municipality.

The street car in Portland, Oregon, USA. Image Credit: Shutterstock.com


This evolution in approaches to climate change planning follows a broader current in urban planning, which since the 1960s has moved away from expert-driven and technocratic processes and created increasing space for participatory processes and facilitative government.

In a nutshell, an increasingly complex and holistic technical understanding of urban emissions is being matched by an increasingly horizontal and networked approach to governing those emissions. (A similar shift is taking place in the more recent attention to urban adaptation and resilience.)

But plans and programs based on this understanding quickly run into significant barriers: institutional siloization and path dependency, a lack of effective information sharing, challenges of data collection and analysis, and the difficulty of mobilizing collective and collaborative action across multiple diverse and dispersed actors (Aylett 2014). The strength of collaborative multi-stakeholder responses is also their weakness. While effective climate change action may not be possible without complex networks of governance, coordinating these networks is no simple task. Urban climate change governance has accordingly been the focus of an expanding body of research (Aylett 2015, 2014, 2013; Betsill & Bulkeley 2004, 2007; Burch 2010; Burch et al. 2013; Romero-Lankao et al. 2013).

“Smart” Urban Climate Governance

Seen from this perspective, the allure of “smart” approaches to green cities is precisely the fact that information technology tools seem so well suited to the challenges that have stalled progress so far. Collecting, sharing and analysing new and existing data, and coordinating complex multi-scalar social networks of collaborative design and implementation are precisely what has drawn attention to new technologies in other sectors.

Disappointingly, current applications of a data-driven and technologically enabled approach to urban sustainability are far from delivering on this potential. Reading through the literature shows that the many interesting works that address the impacts of new technologies on urban governance (for example Elwood 2010, Evans-Cowley 2010, Goldsmith and Crawford 2015, Moon 2002) have nothing to say about the governance of urban sustainability. Work that does address environmental sustainability is dominated by a technocratic focus on energy systems, building efficiency, and transportation that privileges top-down action by municipal experts and planning elites (The Climate Group 2008, Boorsma & Wagener 2007, Kim et al. 2009, Villa & Mitchell 2009). This literature review is ongoing, and I continue to hope to find a body of work that combines a developed understanding of urban sustainability with a detailed reflection on digital governance. As it is, we seem to be working with outdated approaches to both urban sustainability and planning.

An off-shore wind farm near Copenhagen, Denmark. Image Credit: Shutterstock.com


How to update this approach, and use the full potential of data-driven, technologically enabled, and participatory approaches to spur accelerated transitions to sustainable cities is a key question. This research is necessary if we are going to unlock the full potential of the “smart” urbanism to address the necessity of building sustainable cities. It is also important that we avoid rolling back the clock on two decades of “green cities” research by basing our digital strategies around outdated understandings of the urban sustainability challenge.

Conclusions

Cities are responsible for as much as 70% of global greenhouse gas emissions and consume 75% of the world’s energy (Satterthwaite 2008). These figures are often repeated. But taking action at that scale requires both technological and socio-institutional innovations. Efforts to reduce urban emissions are challenged by the complexity of coordinating broad coalitions of action across governmental, private, and civil-society actors, and by the need to effectively collect, share, and analyse new and existing data from across these traditionally siloized sectors.

These complexities have played an important role in holding actual urban emissions reductions far below what is needed to stabilize global emissions within a safe range. Interestingly, these complexities map onto the very strengths of emerging information and communications technology (ICT) tools and Geoweb-enabled approaches to urban planning and implementation. So far, the use of “smart” approaches to address the urban climate challenge has been limited to narrow and technocratic initiatives. But much more is possible. If effective bridges can be built between the ICT and urban sustainability sectors, a profound shift in approaches to the urban governance of climate change could be possible. It is important to increase both sustainability literacy and digital literacy among those involved. Only then will innovations in urban sustainability benefit from a deep understanding of both the new tools at our disposal and the complex challenge to which we hope to apply them.

(A previous version of this was presented as part of the Geothink pre-event at the 2015 American Association of Geographers conference in Chicago, IL. See: www.geothink.ca)

Alexander Aylett is Assistant Professor at the Centre on Urbanisation, Culture and Society at the National Institute for Scientific Research (UCS-INRS) in Montreal, Quebec, Canada.

A First Hand Account of McGill University’s Team-CODE’s Experiences in the 1st Annual ECCE App Challenge hosted by Environmental Systems Research Institute (ESRI)

By Jin Xing

I was one of three Geothink students who competed in the 1st Annual ECCE App Challenge hosted by the Environmental Systems Research Institute’s (ESRI) Canada Centre of Excellence. Team CODE-McGill, which consisted of McGill University students Matthew Tenney, Carl Hughes, and myself, placed second in the competition that concluded on March 20 with the announcement of a winning group from the University of Waterloo.

Although our three team members each have different research interests, all of us study topics related to open data. Our Community Open Data Engage (CODE) application was sparked by an exchange I had with Hughes when we discovered, after the competition had already begun, that we both call Toronto, Ontario home. In fact, it was only after Hughes told me that my neighbourhood was “a better” place to live that we began to interrogate the question of how to evaluate a community using open data.

As we worked on our submission, we noticed that community-level open data attracts more attention than city-wide data. In particular, we found citizens were more concerned with data on traffic, education, and recreation resources in their own neighbourhoods than with other types of data. Our creation: a new approach for exploring a community using an open data platform that connects people and communities.

However, the application that we designed required a number of trade-offs to be decided in the span of only one week. First, we struggled to choose whether to include more data or to favour an easy-to-use interface. In particular, we wanted to develop functionality to integrate a greater variety of community data but didn’t want the application to become too hard to use. After several hours of discussion, we decided to favour an approach that centered on making open data “easy and ready to use.”

The second trade-off involved the selection of ESRI JavaScript APIs. In particular, we had to choose between the ESRI ArcGIS API and ESRI Leaflet for open data integration and visualization. At the beginning, I preferred the ArcGIS API due to its rich functionality. But Tenney pointed out that it was over-qualified for our needs and might delay page loading, which led the team to choose Leaflet.

Finally, we had to decide how to integrate social media. In particular, we needed to decide whether Twitter content should be loaded from a live data stream or simply retrieved from the back-end. All of us felt it would be cool to have a real-time Twitter widget on our application’s page, but we didn’t know how to get it to choose the right tweets. For example, a user named Edmonton might say nothing about the City of Edmonton, and our code would have needed to filter such tweets out in real time. Considering the difficulty of developing such a data filtering AI in one week, we decided to include Twitter content only on the back-end. To accomplish this, we used Python to develop a way to harvest and process data, while ESRI Leaflet handled all the front-end data integration and visualization.
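To give a sense of the filtering problem described above, here is a minimal Python sketch (not the team’s actual code) of the kind of back-end check involved: keeping tweets whose text genuinely mentions a city, rather than tweets that merely come from a user whose screen name matches the city. The dictionary fields (`"user"`, `"screen_name"`, `"text"`) follow the classic Twitter API JSON layout, but the filtering logic itself is a hypothetical illustration.

```python
def mentions_city(tweet: dict, city: str) -> bool:
    """Return True only if the tweet *text* refers to the city.

    A match on the author's screen name alone (e.g. a user named
    "Edmonton") is deliberately ignored.
    """
    text = tweet.get("text", "").lower()
    return city.lower() in text


def filter_city_tweets(tweets: list, city: str) -> list:
    """Keep tweets that talk about the city, dropping the rest."""
    return [t for t in tweets if mentions_city(t, city)]


# Two illustrative tweets: one from a user *named* Edmonton that says
# nothing about the city, and one that actually discusses Edmonton.
tweets = [
    {"user": {"screen_name": "Edmonton"}, "text": "Nice weather today!"},
    {"user": {"screen_name": "jane"},
     "text": "Traffic is heavy in Edmonton tonight."},
]

filtered = filter_city_tweets(tweets, "Edmonton")
```

A real deployment would need far more than substring matching (disambiguation, spam handling, French/English variants), which is exactly why the team deferred this work to an offline back-end step rather than a real-time widget.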

Our application included data on school locations, health facility locations, grocery store locations, gas station locations, green space, cultural facilities, emergency services, census dissemination areas, and Twitter data, all of which were presented as different map layers. We employed the Agile development method for CODE, meaning we quickly built a prototype for CODE and tested it, then repeated this process, adding functions or re-developing bad code.

In actuality, though, we built three prototypes in the first two days and spent another two days on testing, selecting, and re-developing. The Agile method helped keep CODE always functional and easy to extend. The only drawback of using Agile was that local code synchronization became necessary before we pushed to GitHub. If two of us pushed at the same time with different code, our GitHub repository would be messed up. By late Thursday night, we had nearly finished all the planned coding and had even begun to improve the user experience. The search widget and navigation buttons were added on the last day to make open data easy and ready for use in our CODE application.

We felt that by putting information in the hands of concerned citizens and community leaders, CODE is a proof-of-concept for data-driven approaches to building a strong sense of community across cities. CODE also connects people and governments by allowing them to create forums for conversation in specific communities across a city or to search social media sites to find other people in their area.

Furthermore, by integrating and visualizing open data at a community scale, CODE demonstrates a new approach for community exploration. In fact, users can search and select different open data for certain communities on one map, and corresponding statistics are shown as pop-ups. In the initial phase, we provided this community exploration service for citizens in Edmonton and Vancouver.

Overall, I felt attending the ECCE App Challenge was a great experience in integrating ESRI web technologies with open data research. It proves open data can act as a bridge between citizens and cities, and that ESRI products significantly simplify the building of just such a bridge. We believe more applications will continue to be inspired by the ECCE App Challenge and that open data will become closely woven into everyday life. Thanks to ESRI, we got a chance to help shape the next generation of community exploration.

If you have thoughts or questions about this article, get in touch with Jin Xing, Geothink’s Information Technology Specialist, at jin.xing@mail.mcgill.ca.

Spotlight on Recent Publications: Exploring the Hyperbole Behind Crisis Mapping Through Chronic Community Development Issues

By Drew Bush

McGill University Masters Student Ana Brandusescu, lead author on the paper “Confronting the hype: The use of crisis mapping for community development.”

In a paper published this month, Geothink researchers critically examined the role that crisis mapping software such as Crowdmap can play when repurposed to address community development issues in three Canadian communities in Vancouver and Montreal. They argue that such platforms carry many technological constraints, including an intrinsic short-term feel that makes them difficult to deploy on the chronic, long-term issues common to community development.

Entitled “Confronting the hype: The use of crisis mapping for community development,” the paper was published in Convergence: The International Journal of Research into New Media Technologies by McGill University Masters student Ana Brandusescu and Associate Professor Renée Sieber, along with Université du Québec à Montréal Professor Sylvie Jochems. Please find the abstract below.

Each of the case studies examined in the paper involved a different set of circumstances. In Montreal, the researchers worked with a community of low-income immigrants in single-family homes who predominantly spoke French. In contrast, one community in Vancouver consisted of young middle-class families living in subsidized student housing while the other was an ethnically diverse low-income community living in rented housing. Both Vancouver communities predominantly spoke English.

“The Vancouver cases had issues resembling crises, for example, immediate rezoning, antidensification, and loss of social housing,” the researchers wrote in the paper. “The Montreal organizers wished to address longer term issues like the recording of community assets.”

In each community, the researchers prepared participants at initial community meetings by using storyboards or comic books to explain the process of mapping. Furthermore, a manual they created helped application managers and community members understand how to manage the application, submit reports (via texts, tweets, Web reports, e-mails and smartphone messages), geolocate reports, and handle messages that might contain personal identifiers or foul language. In Vancouver, the managers were community activists, while in the Montreal case the managers were part-time professional community organizers.

Although each community differed in their implementation of the mapping software and program, the findings were striking.

“In this article, we explored the reality behind the hype of crisis mapping and revealed that hype through its repurposing to community development,” they write in their conclusion. “We confronted the zero-cost argument and found numerous technology constraints, which confirmed the challenges of introducing a new technological medium to community development processes.”

“Burns asserted that knowledge politics concerns the role of power in developing a map but the politics also refers to the overall hype to which we so easily succumb,” they add later in their conclusion in reference to a paper by a researcher at the University of Washington, Ryan Burns, entitled Moments of closure in the knowledge politics of digital humanitarianism. “If we acknowledge and then work past the hype then perhaps we will achieve more meaningful and sustainable systems.”

Abstract
Confronting the hype: The use of crisis mapping for community development
This article explores the hyperbole behind crisis mapping as it extends into more long term or ‘chronic’ community development practices. We critically examined developer issues and participant (i.e. community organization) usage within the context of local communities. We repurposed the predominant crisis mapping platform Crowdmap for three cases of community development in Canadian anglophone and francophone communities. Our case studies show mixed results about the actual cost of deployment, the results of disintermediation, and local context with the mapping application. Lastly, we discuss the relationship of hype, temporality, and community development as expressed in our cases.

If you have thoughts or questions about the article, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca.

Spotlight on Recent Publications: Teresa Scassa at the Intersection of Intellectual Property Rights and Municipal Transit Data

By Drew Bush

Teresa Scassa is Canada Research Chair in Information Law at the University of Ottawa.

This story was originally reported on Teresa Scassa’s personal blog, which you can find here.

In a paper just published in the Fordham Urban Law Journal, Geothink researcher Teresa Scassa argues that the actual laws governing intellectual property (IP) rights are often surprisingly irrelevant in disputes over rights to municipal transit data. Instead, she finds that being in a position to make a claim to IP rights is often more important than actually having a good claim.

“How people decide to interact with each other is more important than what their precise legal rights might be,” Scassa, the Canada Research Chair in Information Law at the University of Ottawa, wrote in an e-mail to Geothink.ca. “Often, to understand the precise boundaries of those rights it is necessary to litigate and one or both parties may lack the resources to go to court. So, in those circumstances, parties may reach an understanding of how they will set the boundaries of their relationships.”

Her paper, entitled Public Transit Data Through an Intellectual Property Lens: Lessons About Open Data, examines some of the challenges presented by the transition from ‘closed’ to open data within the municipal context. She completed the paper as part of a Geothink project examining open data in a concrete context that’s particular to municipalities.

“In the municipal transit data context, there was generally an imbalance of resources between developers and municipalities, and there was little desire on either part to go to court,” she added. “Nevertheless, in the early days, municipal transit authorities asserted their IP rights using cease and desist letters. This assertion of IP rights was met with arguments about the need for open data, and eventually compromises were reached around open data that shifted over time, and varied from one municipality to another.”

In the paper, she examines how these legal developments have impacted the use of real-time transit data by developers seeking to make use of this data in digital applications and corporations hoping to add value to products and services they offer. In particular, the paper covers three types of data: 1) route maps; 2) static data (such as bus timetables that change only seasonally); and 3) real-time GPS data generated by units installed on transit vehicles.

A number of municipalities exerted their IP rights over such data because of concerns that ranged from ensuring its quality and authenticity to preserving the ability to make data available on a cost-recovery basis.

“The emerging open data movement shifted some of these concerns and created a new set of expectations and practices around open municipal transit data,” she wrote in her e-mail. “As data become more complex (with the advent of real-time GPS data, for example) the IP issues shifted and changed again, raising new questions about open data in this context. This is where the next phase of my research will take me.”

To find out more about Teresa Scassa’s work, visit her personal blog here or follow her on Twitter @teresascassa. For more on IP, check out another of her recent papers (written with University of Ottawa doctoral student Haewon Chung) that analyzes various types of volunteer citizen science activities to determine whether they raise legal questions about IP ownership.

Find a link to the article along with its abstract below.

Public Transit Data Through an Intellectual Property Lens: Lessons About Open Data

This paper examines some of the challenges presented by the transition from ‘closed’ to open data within the municipal context, using municipal transit data as a case study. The particular lens through which this paper examines these challenges is intellectual property law. In a ‘closed data’ system, intellectual property law is an important means by which legal control over data is asserted by governments and their agencies. In an ‘open data’ context, the freedom to use and distribute content is a freedom from IP constraints. The evolution of approaches to open municipal transit data offers some interesting examples of the role played by intellectual property at every stage in the evolution of open municipal transit data, and it highlights not just the relationship between municipalities and their residents, but also the complex relationships between municipalities, residents, and private sector service providers.

If you have thoughts or questions about the article, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca.

Spotlight on Recent Publications: Interrogating the Nature of Geosocial Data with Stéphane Roche

London Olympic wayfinding beacon (Photo courtesy of www.mudarchitecture.com).

By Drew Bush

In two articles published this January, Geothink researcher Stéphane Roche and his doctoral student Teriitutea Quesnot argue that not all geosocial data is equivalent, and that better data on the social significance of a landmark could greatly enhance our understanding of human wayfinding behaviour. A Professor of Geomatics at Université Laval, Roche has focused his research over the past five years on how new forms of digital spatiality affect spatial reasoning skills and the capacity of individuals to engage in the city.

Entitled “Measure of Landmark Semantic Salience through Geosocial Data Streams,” the first paper was published by Roche in the ISPRS International Journal of Geo-Information. The authors write that a lot of research “in wayfinding is done in order to enable individuals to reach as quickly as possible a desired destination, to help people with disabilities by designing cognitively appropriate orientation signs, and reduce the fact of being lost.”

Previous researchers in the field of geo-cognition have tried to characterize the salience of landmarks in human wayfinding behaviour. Most have classified differing landmarks by visual, structural and semantic cues. However, the social dimensions of a landmark, such as how they are practised or recognized by individuals or groups, had been excluded from its semantic salience (or often reduced to historical or cultural cues), according to the authors.

Instead, the authors follow in a tradition of research which utilizes text mining from the web to understand how places are expressed by Internet users rather than relying on how they are visually perceived. Such an approach has been made possible by social media and mobile communications technology that has resulted in vast user-generated databases that constitute “the most appropriate VGI data for the detection of global semantic landmarks.”

In conducting their research, the authors examined world-famous landmarks and detected semantic landmarks in the cities of Vienna and Paris using data from Foursquare API v2 and Facebook API v2.1 from September 29, 2014 to November 15, 2014.

In a second paper entitled “Platial or Locational Data? Toward the Characterization of Social Location Sharing,” the authors expanded on this theme in arguing that not all geosocial data is equal. The paper was presented at the 48th Hawaii International Conference on System Sciences this past January.

Some data, which the authors consider “platial,” relate more to users’ experiences of a given place, while “locational” data are tied to the actual coordinates of a place. In the context of geosocial data, locational data might mean the exact location of the Eiffel Tower, while platial data could refer to a person passing by the Eiffel Tower or taking a photo of it from another location.

Because each can potentially represent a very different kind of data point, they must be treated differently. As the authors write, “With the objective of a better understanding of urban dynamics, lots of research projects focused on the combination of geosocial data harvested from different social media platforms. Those analyses were mainly realized on a traditional GIS, which is a tool that does not take into account the platial component of spatial data. Yet, with the advent of Social Location Sharing, the inconvenience of relying on a classic GIS is that a large part of VGI is now more platial than locational.”

Find links to each article along with their abstracts below.

Measure of Landmark Semantic Salience through Geosocial Data Streams

ABSTRACT

Research in the area of spatial cognition demonstrated that references to landmarks are essential in the communication and the interpretation of wayfinding instructions for human beings. In order to detect landmarks, a model for the assessment of their salience has been previously developed by Raubal and Winter. According to their model, landmark salience is divided into three categories: visual, structural, and semantic. Several solutions have been proposed to automatically detect landmarks on the basis of these categories. Due to a lack of relevant data, semantic salience has been frequently reduced to objects’ historical and cultural significance. Social dimension (i.e., the way an object is practiced and recognized by a person or a group of people) is systematically excluded from the measure of landmark semantic salience even though it represents an important component. Since the advent of mobile Internet and smartphones, the production of geolocated content from social web platforms—also described as geosocial data—became commonplace. Actually, these data allow us to have a better understanding of the local geographic knowledge. Therefore, we argue that geosocial data, especially Social Location Sharing datasets, represent a reliable source of information to precisely measure landmark semantic salience in urban areas.

Platial or Locational Data? Toward the Characterization of Social Location Sharing

ABSTRACT

Sharing “location” information on social media became commonplace since the advent of smartphones. Location-based social networks introduced a derivative form of Volunteered Geographic Information (VGI) known as Social Location Sharing (SLS). It consists of claiming “I am/was at that Place”. Since SLS represents a singular form of place-based (i.e. platial) communication, we argue that SLS data are more platial than locational. According to our data classification of VGI, locational data (e.g. a geotagged tweet which geographic dimension is limited to its coordinate information) are a reduced form of platial data (e.g. a Swarm check-in). Therefore, we believe these two kinds of data should not be analyzed on the same spatial level. This distinction needs to be clarified because a large part of geosocial data (i.e. spatial data published from social media) tends to be analyzed on the basis of a locational equivalence and not on a platial one.

If you have thoughts or questions about the article, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca.