Category Archives: Refereed Publications & Conferences

Geothink Student Evan Hamilton Explores Canadian Municipal Open Data and the Role of Journalism

Geothink student Evan Hamilton recently defended his master’s thesis on Toronto data journalists’ use of open data.

By Naomi Bloch

Data journalists are some of the most active users of government open data in Canada. In his recently defended thesis, Evan Hamilton, a master’s student in the University of Toronto’s Faculty of Information, examined the role of data journalists as advocates, users, and producers of open data.

Hamilton’s thesis, titled “Open for reporting: An exploration of open data and journalism in Canada,” addressed four research questions:

  1. Are open data programs in Ontario municipalities developing in a way that encourages effective business and community development opportunities?
  2. How and why do journalists integrate open data in reporting?
  3. What are the major challenges journalists encounter in gaining access to government data at the municipal level?
  4. How does journalism shape open data development at both the policy and grassroots levels within a municipality?

To inform his work, Hamilton conducted in-depth, semi-structured interviews with three key data journalists in the City of Toronto: Joel Eastwood at the Toronto Star, William Wolfe-Wylie at the CBC, and Patrick Cain at Global News. While open data is often touted as a powerful tool for fostering openness and transparency, in his paper Hamilton notes that there is always the risk that “the rhetoric around open data can also be employed to claim progress in public access, when in fact government-held information is becoming less accessible.”

In an interview with Geothink, Hamilton explained that the journalists made important distinctions between the information currently available on Canadian open data portals and the information they typically seek in order to develop compelling, public-interest news stories. “One of the big things I took away from my interviews was the differentiation that journalists made between Freedom of Information and open data,” said Hamilton. “They were using them for two completely different reasons. Ideally, they would love to have all that information available on open data portals, but the reality is that the portals are just not as robust as they could be right now. And a lot of that information does exist, but unfortunately journalists have to use Freedom of Information requests to get it, which is a process that can take a lot of time and not always lead to the best end result.”

Legal provisions at various levels of government allow Canadians to make special Freedom of Information requests to try to access public information that is not readily available by other means. A nominal fee is usually charged. In Toronto, government agencies generally need to respond to such requests within 30 days. Even so, government responses do not always result in the provision of usable data, and if journalists request large quantities of information, departments have the right to extend the 30-day response time. For journalists, a delay of even a few days can kill a story.
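The self-serve alternative journalists describe is a portal with a programmatic interface. Many municipal open data portals are built on CKAN, whose action API returns dataset metadata as JSON. As a minimal sketch of what consuming such a response looks like, the snippet below parses a CKAN-style `package_search` payload; the response is canned and the dataset titles are hypothetical examples, not any city's actual catalogue.

```python
import json

# A canned CKAN-style "package_search" response. Real portals return this
# shape from their action API; the titles here are hypothetical examples.
SAMPLE_RESPONSE = json.dumps({
    "success": True,
    "result": {
        "count": 2,
        "results": [
            {"title": "Building Permits", "metadata_modified": "2015-06-01"},
            {"title": "Street Tree Inventory", "metadata_modified": "2015-03-15"},
        ],
    },
})

def dataset_titles(raw_json: str) -> list[str]:
    """Extract dataset titles from a CKAN-style package_search JSON payload."""
    payload = json.loads(raw_json)
    if not payload.get("success"):
        return []  # CKAN sets "success" to false on failed queries
    return [dataset["title"] for dataset in payload["result"]["results"]]

print(dataset_titles(SAMPLE_RESPONSE))
```

When data is published this way, a reporter can script a query in minutes; the contrast with a 30-day (and extendable) Freedom of Information timeline is the point the journalists above are making.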

While the journalists Hamilton interviewed recognized that open data portals were limited by a lack of resources, there was also a prevailing opinion that many government agencies still prefer to vet and protect the most socially relevant data. “Some were very skeptical of the political decisions being made,” Hamilton said. “Like government departments are intentionally trying to prevent access to data on community organizations or data from police departments looking at crime statistics in specific areas, and so they’re not providing it because it’s a political agenda.”

Data that helps communities

In his thesis, Hamilton states that further research is needed to better understand the motivations behind government behaviours. A more nuanced explanation involves the differing cultures within specific municipal institutions. “The ones that you would expect to do well, do do well, like the City of Toronto’s Planning and Finance departments,” Hamilton said. “Both of them provide really fantastic data that’s really up-to-date, really useful and accessible. They have people you can talk to if you have questions about the data. So those departments have done a fantastic job. It’s just having all the other departments catch up has been a larger issue.”

An issue of less concern to the journalists Hamilton consulted is privacy. The City’s open data policy stresses a balance between appropriate privacy protection mechanisms and the timely release of information of public value. Hamilton noted that in Toronto, the type of information currently shared as open data poses little risk to individuals’ privacy. At the same time, the journalists he spoke with tended to view potentially high-risk information, such as crime data, as cases where public interest should outweigh privacy concerns.

Two of the three journalists stressed the potential for data-driven news stories to help readers better understand and address needs in their local communities. According to Hamilton’s thesis, “a significant factor that prevents this from happening at a robust level is the lack of data about marginalized communities within the City.”

The journalists’ on-the-ground perspective echoes the scholarly literature, Hamilton found. If diverse community voices are not involved in the development of open data policies and objectives, chances for government efforts to meet community needs are hampered. Because of their relative power, journalists do recognize themselves as representing community interests. “In terms of advocacy, the journalists identify themselves as open data advocates just because they have been the ones pushing the city for the release of data, trying to get things in a usable format, and creating standard processes,” Hamilton said. “They feel they have that kind of leverage, and they act as an intermediary between a lot of groups that don’t have the ability to get to the table during negotiations and policy development. So they’re advocating for their own interests, but as they fulfill that role they’re advocating for marginalized communities, local interest groups, and people who can’t get to the table.”

Policy recommendations

Hamilton’s research also pointed to ways in which data journalists can improve their own professional practices when creating and using open data. “There needs to be more of a conversation between journalists about what data journalism is and how you can use open data,” Hamilton said. “When I talked to them, there was not a thing like, ‘Any time you use a data set in your story you cite the data set or you provide a link to it.’ There’s no standard practice for that in the industry, which is problematic, because then they’re pulling numbers out of nowhere and they’re trusting that you’ll believe it. If you’re quoting from a data set you have to show exactly where you’re getting that information, just like you wouldn’t anonymize a source needlessly.”

While Hamilton concentrated on building a picture of journalists’ open data use in the City of Toronto, his findings resulted in several policy recommendations for government agencies more broadly. First, Hamilton stressed that “as a significant user group, journalists need to be consulted in a formal setting so that open data platforms can be better designed to target their specific needs.” This is necessary, according to Hamilton, in order to permit journalists to more effectively advocate on behalf of their local communities and those who may not have a voice.

Another recommendation is aimed at meeting the needs of open data users with different levels of competency. Although he recognizes the challenges involved, in his concluding chapter Hamilton writes, “Municipal governments need to allocate more resources to open data programs if they are going to be able to fulfill the needs of both a developer class requiring technical specifications, and a general consumer class that requires tools (for example, visualizations and interactives) to consume the data.”

Finally, Hamilton recommends that municipalities engage in more formal efforts “to combat internal culture in municipal departments that are against publishing public information. Data should be viewed as a public service, and public data should be used in the public interest.”

If you have any questions for Evan, reach him on Twitter here: @evanhams


Evan Hamilton successfully defended his Master of Information thesis on September 29 at the Faculty of Information, University of Toronto. His work was supervised by Geothink co-applicant researcher Leslie Regan Shade, associate professor in the University of Toronto’s Faculty of Information. Other committee members included University of Toronto’s Brett Caraway and Alan Galey (chair), as well as April Lindgren, an associate professor at Ryerson University’s School of Journalism and founding director of the Ryerson Journalism Research Centre, a Geothink partner organization.

Abstract

This thesis describes how open data and journalism have intersected within the Canadian context in a push for openness and transparency in government collected and produced data. Through a series of semi-structured interviews with Toronto-based data journalists, this thesis investigates how journalists use open data within the news production process, view themselves as open data advocates within the larger open data movement, and use data-driven journalism in an attempt to increase digital literacy and civic engagement within local communities. It will evaluate the challenges that journalists face in gathering government data through open data programs, and highlight the potential social and political pitfalls for the open data movement within Canada. The thesis concludes with policy recommendations to increase access to government held information and to promote the role of data journalism in a civic building capacity.

Reference: Hamilton, Evan. (2015). Open for reporting: An exploration of open data and journalism in Canada (MI thesis). University of Toronto.

If you have thoughts or questions about this article, get in touch with Naomi Bloch, Geothink’s digital journalist, at naomi.bloch2@gmail.com.

Spotlight on Recent Publications: Critical Reflections on Outcomes from Three Geoweb Partnerships

By Naomi Bloch

Exploring university–community partnerships

Participatory geospatial technologies have the potential to support and promote citizen engagement. This great promise has led to more collaborations between academics and community partners interested in pursuing this aim. In their recently published paper, “A web of expectations: Evolving relationships in community participatory geoweb projects,” four Geothink researchers and their colleagues cast a reflective eye on the participatory action research processes behind three completed geoweb partnership projects.

Co-author Jon Corbett, an associate professor in Community, Culture and Global Studies at the University of British Columbia’s Okanagan campus, sees their ACME journal article as helping to fill a gap in the geoweb literature. “For me, one of the things I’m most interested in is how—in a truthful and well-positioned way—we can talk about the veracity of the work that we’ve done in regards to its ability to actually bring about impact and social change,” Corbett said.

In the article, the authors compare the three cases to consider some of the tangible, empirical challenges the projects encountered, concentrating on the frictions that can occur where technical and social considerations intersect.

Central Okanagan Community Food Map interface

Participatory geoweb initiatives commonly rely on out-of-the-box mapping tools. For these three projects, a central aim was to employ the expertise of the university researchers to co-develop and co-evaluate custom geospatial web tools that could address community partners’ objectives. Ideally, such collaborations can benefit all parties. Researchers can learn about the potential and the limitations of the geoweb as a tool for civic engagement while partners have the opportunity to reflect on their objectives and access a wider tool set for accomplishing them. In reality, collaborations require compromises and negotiations. The question then becomes: when are researchers’ academic objectives and partners’ community objectives truly complementary?

In the first case study, the geoweb was used to create a participatory business promotion website for a rural Quebec community, intended as one component of a larger regional economic development strategy. The second case was a collaboration between two university partners and a cultural heritage organization in Ontario. The partners hoped the customized online tool could “serve as a ‘living’ repository of cultural heritage information that was both accessible to the public and could facilitate the contribution of knowledge from the public.” In the third project, university researchers worked with government and grassroots organizations at local as well as provincial levels. The vision in this case was to enable non-expert community members in the Okanagan region to share their own knowledge and experiences about local food and its availability.

Corbett explained that in reflecting on their work, the researchers realized that as social scientists with very specific domains of expertise in political science, geographic information systems, and community research, “the types of skills we needed to negotiate the relationships were far different from the sorts of traditional disciplinary fields that we work in.”  Their collaborators tended to identify the academics more as technical consultants than scholars. As the authors write, “most academics remain untrained in software development, design, marketing, long-term application management and updating, legal related issues, [and] terms of service.”

Although the three collaborations were quite different in terms of the publics involved as well as the negotiated objectives of the projects and the tools employed to achieve them, the authors identified several key common themes. The authors note, “In all three case studies, we found that the process of technology development had substantial influence on the relationship between university developers and community organization partners. This influence was seen in the initial expectations of community partners, differential in power between researcher and community, sustainability of tools and collaborations, and the change from research collaboration towards ‘deal making.'”

In the end, Corbett said, “All of the projects were extremely precarious in how we could assign value or success to them. The paper was really an academic reflection on the outcomes of those three different projects.”

Abstract

New forms of participatory online geospatial technology have the potential to support citizen engagement in governance and community development. The mechanisms of this contribution have predominantly been cast in the literature as ‘citizens as sensors’, with individuals acting as a distributed network, feeding academics or government with data. To counter this dominant perspective, we describe our shared experiences with the development of three community-based Geospatial Web 2.0 (Geoweb) projects, where community organizations were engaged as partners, with the general aim to bring about social change in their communities through technology development and implementation. Developing Geoweb tools with community organizations was a process that saw significant evolution of project expectations and relationships. As Geoweb tool development encountered the realities of technological development and implementation in a community context, this served to reduce organizational enthusiasm and support for projects as a whole. We question the power dynamics at play between university researchers and organizations, including project financing, both during development and in the long term. How researchers managed, or perpetuated, many of the popular myths of the Geoweb, namely that it is inexpensive and easy to use (though not to build, perhaps) impacted the success of each project and the sustainability of relationships between researcher and organization. Ultimately, this research shows the continuing gap between the promise of online geospatial technology, and the realities of its implementation at the community level.

Reference: Johnson, Peter A., Jon Corbett, Christopher Gore, Pamela J. Robinson, Patrick Allen, and Renee E. Sieber. (2015). A web of expectations: Evolving relationships in community participatory geoweb projects. ACME: An International E-Journal for Critical Geographies, 14(3), 827-848.

If you have thoughts or questions about this article, get in touch with Naomi Bloch, Geothink’s digital journalist, at naomi.bloch2@gmail.com.

Crosspost: Being Philosophical About Crowdsourced Geographic Information

This Geo: Geography and Environment blog post is cross-posted with permission from the authors, Renée Sieber (McGill University, Canada) and Muki Haklay (University College London, UK).

By Renée Sieber and Muki Haklay

Our recent paper, The epistemology(s) of volunteered geographic information: a critique, started from a discussion we had about changes within the geographic information science (GIScience) research community over the past two decades. We’ve both been working in the areas of participatory geographic information systems (GIS) and critical studies of GIScience since the late 1990s, engaging people from all walks of life with the information available in GIS. Many times we would work together with people to create new geographic information and maps. Our goal was to help reflect their point of view of the world and their knowledge of local conditions, not always to aim for universal rules and principles. For example, the image below is from a discussion with the community in Hackney Wick, London, where individuals collaborated to ensure that the information captured represented their views on the area and its future, in light of the Olympic works happening on their doorstep. The GIScience research community, by contrast, emphasizes quantitative modelling and universal rules about geographic information (exemplified by frequent mentions of Tobler’s first law of geography). That community was not especially welcoming of qualitative, participatory mapping efforts, leaving them mostly at the margins of the discipline.

Participatory Mapping in Hackney Wick, London, 2007

Around 2005, researchers in GIScience started to notice that when people used their Global Positioning System (GPS) devices to record where they took pictures, or used online mapping apps to make their own maps, they were generating a new kind of geographic information. Once projects like OpenStreetMap and other sources of user-generated geographic information came on the scene, the early hostility evaporated, and volunteered geographic information (VGI) or crowdsourced geographic information was embraced as a valid, valuable and useful source of information for GIScience research. More importantly, VGI became an acceptable research subject, raising topics such as how to assess its quality and what motivates people to contribute.

This about-face was puzzling, and we felt it justified an investigation of the concepts and ideas that allowed it to happen. Why did VGI become part of the “truth” in GIScience? In philosophical language, the questions ‘Where does knowledge come from? How was it created? What is the meaning and truth of knowledge?’ belong to epistemology, and our paper evolved into an exploration of the epistemology, or more accurately the multiple epistemologies, inherent in VGI. It is easy to make the case that VGI is a new way of knowing the world, given (1) its potential to disrupt existing practices (e.g., the way OpenStreetMap provides an alternative to official maps, as shown in the image below) and (2) the way VGI both constrains contributions (e.g., 140 characters) and opens them up (e.g., through easy-to-use interfaces and multimedia offerings). VGI affords a new epistemology, a new way of knowing geography, of knowing place. Rather than observing a way of knowing, we were interested in what researchers thought the epistemology of VGI was. They were building it in real time and attempting to ensure it conformed to existing ways of knowing. An analogy: instead of knowing a religion from the inside, you construct your conception of it, with your own assumptions and biases, from the outside. We argue that this kind of construction was occurring with VGI.

OpenStreetMap mapping party (Nono Fotos)

We were likewise interested in the way that long-standing critics of mapping technologies would respond to new sources of data and new platforms for those data. Criticism tends to be grounded in the structuralist works of Michel Foucault on power and how it is shaped by wider societal structures. Critics extended traditional notions of volunteerism and empowerment to VGI, without necessarily examining whether these were applicable to the new ‘ecosystem’ of geospatial apps, companies, code and data. We were also curious why the critiques focussed on the software platforms used to generate the data (e.g., Twitter) instead of the data themselves (tweets). It was as if the platforms used to create and share VGI were embedded in various socio-political and economic configurations, while the data remained innocent of any association with those assemblages. Lastly, we saw an unconscious shift in the Critical GIS/GIScience field from the collective to the personal. Historically, in the wider field of human geography, when we thought of civil society mapping together using technology, we looked at collective activities like counter-mapping (e.g., a community fighting an extension to an airport runway by conducting a spatial analysis to demonstrate the adverse impacts of noise or pollution on the surrounding geography). We believe the shift occurred because Critical GIS scholars were never comfortable with community and consensus-based action in the first place. In hindsight, it is probably easier to critique the (individual) emancipatory potential of the technology than its (collective) empowerment potential. Moreover, Critical GIS researchers have shifted their attention away from geographic information systems towards the software stack of geospatial software and geosocial media, which raises questions about what is covered by this term.

For all of these reasons and more, we decided to investigate the “world building” from both the instrumentalist scientists and from their critics.

We do use some philosophical framing—Borgmann has a great idea called the device paradigm—to analyse what is happening, and we hope that the paper will contribute to the debate in critical studies of geographical information, extending it beyond the confines of GIScience to human geography more broadly.

About the authors: Renée E. Sieber is an Associate Professor in the Department of Geography and the School of Environment at McGill University. Muki Haklay is Professor of Geographical Information Science in the Department of Civil, Environmental and Geomatic Engineering at University College London.

Call for Papers (Book): Geoweb Policy, Law, and Ethics


Hello everyone,

We are putting together a draft prospectus for consideration by the University of Ottawa Press for their Law, Technology and Media book series. The edited volume will focus on the legal, policy, regulatory, and ethical issues arising from the geoweb. Anticipated issues that the volume will cover include privacy, surveillance, IP, licensing, open data, the public/private divide, citizen engagement, and governance.

We seek brief expressions of interest for chapters. Please send to both Elizabeth and Leslie, by August 30, a chapter title and a short (150-200 word) abstract for our consideration.

Thank you,
Leslie and Elizabeth

Crosspost: Green Cities and Smart Cities: The potential and pitfalls of digitally-enabled green urbanism

The Vancouver Convention Centre in Vancouver, BC, Canada was the world’s first LEED Platinum-certified convention center. It also has one of the largest green roofs in Canada. Image Credit: androver / Shutterstock.com

This post is cross-posted with permission from Alexander Aylett, from UGEC Viewpoints. Aylett is an Assistant Professor at the Centre on Urbanisation, Culture and Society at the National Institute for Scientific Research (UCS-INRS) in Montreal, Quebec.

By Alexander Aylett

Since its early days, the discourse around “smart cities” has included environmental sustainability as one of its core principles. The application of new digital technologies to urban spaces and processes is celebrated for its ability to increase the well-being of citizens while reducing their environmental impacts. But this engagement with sustainability has been limited to a technocratic focus on energy systems, building efficiency, and transportation. It has also privileged top-down interventions by local government actors. For all its novelty, the smart cities discussion is operating with a vision of urban sustainability that dates from the 1990s, and an approach to planning from the 1950s.

This definition of “urban sustainability” overlooks key facets of a city’s ecological footprint (such as food systems, resource consumption, production-related greenhouse gas emissions, air quality, and the urban heat island effect). It also ignores the ability of non-state actors to contribute meaningfully to the design and implementation of urban policies and programs. But that need not be the case. In fact, if employed properly, new information technologies seem like ideal tools to address some of urban sustainability’s most persistent challenges.

Progress and Lasting Challenges in Local Climate Governance

Let’s take a step back. Discussions of smart cities often begin with an account of the capabilities of specific technologies or interfaces and then imagine urbanism – and urban sustainability – through the lens of those technologies. I’d like to do the opposite: begin with the successes and lasting challenges of urban sustainability and interpret the technologies from within that context. To understand the role that “smart” technologies could play in enabling sustainable cities, it’s useful to first look at what we have managed to accomplish so far, and what still needs to be done.

For those of us working on sustainable cities and urban responses to climate change, the past two decades have been a period of both amazing successes and enduring challenges. In the early 1990s a handful of cities began promoting the (at that time) counterintuitive idea that local governments had a key role to play in addressing global climate change. Since then, the green cities movement has won significant discursive, political, and technical battles.

Global inter-municipal organizations like ICLEI or the C40 now have memberships that represent thousands of cities. Two decades of work have created planning standards and tools and an impressive body of “best practice” literature. Through the sustained efforts of groups like ICLEI, cities are now recognized as official governmental stakeholders in the international climate change negotiations coordinated by the United Nations.

But – crucially – real urban emissions reductions are lagging well below what is needed to help keep global CO2 within safe limits. Looking at the efforts of individual cities and the results of a global Urban Climate Change Governance survey that I conducted while at MIT (Aylett 2014, www.urbanclimatesurvey.com) shows why. Apart from a small contingent of charismatic cities like Vancouver, Portland, or Copenhagen, cities are struggling to move beyond addressing the “low-hanging fruit” of emissions from municipal facilities (i.e., vehicle fleets, municipal buildings, street lighting – known as “corporate emissions”) to taking action on the much more significant emissions generated by the broader urban community (i.e., business, industry, transportation, and residential emissions).

This problem has been with us since the early days of urban climate change responses. But how we understand it has changed significantly. Where some cities used to inventory only their corporate emissions, this is now rare. Current guidelines cover community-wide emissions and work is underway to create a global standard for emissions inventories that will also engage with emissions produced in the manufacture of the goods and services consumed within cities (see Hoornweg et al. 2011).

Built on this increased scope in our technical understanding of urban emissions is a change in how we understand the work of governing climate change at the local level. A top-down vision of climate action focused on the regulatory powers of isolated local government agencies is being replaced by one that is horizontal, relational, and collaborative. This approach transforms relationships both inside and outside of local governments, linking together traditionally siloized municipal agencies and forging partnerships with civil-society and business actors (Aylett 2015).

The increased prominence of non-state actors in urban climate change governance has led to growing calls for partnerships across the public-private divide (Osofsky et al. 2007; Andonova 2010; Bontenbal and Van Lindert 2008). These partnerships play an important role in overcoming gaps in capacity, translating the climate change impacts and response options into language that is meaningful to different groups and individuals, and accelerating the development of solutions. Follow-up analysis of the 2014 MIT-ICLEI Climate survey shows that these partnerships have an important positive impact on the scope of concrete emissions reductions. Cities with stronger partnerships appear to be more able to create concrete emissions reductions outside of areas directly controlled by the municipality.

The street car in Portland, Oregon, USA. Image Credit: Shutterstock.com

This evolution in approaches to climate change planning follows a broader current in urban planning, which since the 1960s has moved away from expert-driven and technocratic processes and created increasing space for participatory processes and facilitative government.

In a nutshell, an increasingly complex and holistic technical understanding of urban emissions is being matched by an increasingly horizontal and networked approach to governing those emissions. (A similar shift is taking place in the more recent attention to urban adaptation and resilience.)

But plans and programs based on this understanding quickly run into the significant barriers of institutional siloization and path dependency, a lack of effective information sharing, challenges of data collection and analysis, and difficulty mobilizing collective and collaborative action across multiple diverse and dispersed actors (Aylett 2014). The strength of collaborative multi-stakeholder responses is also their weakness. While effective climate change action may not be possible without complex networks of governance, coordinating these networks is no simple task. The subject of urban climate change governance has been the focus of an expanding body of research (Aylett 2015, 2014, 2013; Betsill & Bulkeley 2004, 2007; Burch 2010; Burch et al. 2013; Romero-Lankao et al. 2013).

“Smart” Urban Climate Governance

Seen from this perspective, the allure of “smart” approaches to green cities is precisely the fact that information technology tools seem so well suited to the challenges that have stalled progress so far. Collecting, sharing and analysing new and existing data, and coordinating complex multi-scalar social networks of collaborative design and implementation are precisely what has drawn attention to new technologies in other sectors.

Disappointingly, current applications of a data-driven and technologically enabled approach to urban sustainability are far from delivering on this potential. Reading through the literature shows that the many interesting works that address the impacts of new technologies on urban governance (for example Elwood 2010, Evans-Cowley 2010, Goldsmith and Crawford 2015, Moon 2002) have nothing to say about the governance of urban sustainability. Work that does address environmental sustainability is dominated by a technocratic focus on energy systems, building efficiency, and transportation that privileges top-down action by municipal experts and planning elites (The Climate Group 2008, Boorsma & Wagener 2007, Kim et al. 2009, Villa & Mitchell 2009). This literature review is ongoing, and I continue to hope to find a body of work that combines a developed understanding of urban sustainability with a detailed reflection on digital governance. As it is, we seem to be working with outdated approaches to both urban sustainability and planning.

An off-shore wind farm near Copenhagen, Denmark. Image Credit: Shutterstock.com


How to update this approach, and use the full potential of data-driven, technologically enabled, and participatory approaches to spur accelerated transitions to sustainable cities, is a key question. This research is necessary if we are going to unlock the full potential of “smart” urbanism to address the challenge of building sustainable cities. It is also important that we avoid rolling back the clock on two decades of “green cities” research by basing our digital strategies on outdated understandings of the urban sustainability challenge.

Conclusions

Cities are responsible for as much as 70% of global greenhouse gas emissions and consume 75% of the world’s energy (Satterthwaite 2008). These figures are often repeated. But taking action at that scale requires both technological and socio-institutional innovations. Efforts to reduce urban emissions are challenged by the complexity of coordinating broad coalitions of action across governmental, private, and civil-society actors, and by the need to effectively collect, share, and analyse new and existing data from across these traditionally siloized sectors.

These complexities have played an important role in holding actual urban emissions reductions far below what is needed to stabilize global emissions within a safe range. Interestingly, addressing these complexities is also a core strength of emerging information and communications technology (ICT) tools and Geoweb-enabled approaches to urban planning and implementation. Currently, the use of “smart” approaches to address the urban climate challenge has been limited to narrow and technocratic initiatives. But much more is possible. If effective bridges can be built between the ICT and urban sustainability sectors, a profound shift in approaches to the urban governance of climate change could be possible. It is important to increase both sustainability and digital literacy among those involved. Only then will innovations in urban sustainability benefit from a deep understanding of both the new tools at our disposal and the complex challenge to which we hope to apply them.

(A previous version of this was presented as part of the Geothink pre-event at the 2015 American Association of Geographers conference in Chicago, IL. See: www.geothink.ca)

Alexander Aylett is Assistant Professor at the Centre on Urbanisation, Culture and Society at the National Institute for Scientific Research (UCS-INRS) in Montreal, Quebec, Canada.

A First Hand Account of McGill University’s Team-CODE’s Experiences in the 1st Annual ECCE App Challenge hosted by Environmental Systems Research Institute (ESRI)

By Jin Xing

I was one of three Geothink students who competed in the Environmental Systems Research Institute’s (ESRI) 1st Annual ECCE App Challenge, hosted by the institute’s Canada Centre of Excellence. Team CODE-McGill, which consisted of McGill University students Matthew Tenney, Carl Hughes, and myself, placed second in the competition, which concluded on March 20 with the announcement of a winning group from the University of Waterloo.

Although our three team members each have different research interests, all of us study topics related to open data. Our Community Open Data Engage (CODE) application was sparked by an exchange I had with Hughes after the competition had already begun, when we discovered we both call Toronto, Ontario, home. In fact, it was only after Hughes told me that my neighbourhood was “a better” place to live that we began to interrogate the question of how to evaluate a community using open data.

As we worked on our submission, we noticed that community-level open data attracts more attention than city-wide data. In particular, we found citizens were more concerned with data on traffic, education, and recreation resources in their own neighbourhoods than with other types of data. Our creation: a new approach for exploring a community using an open data platform that connects people and communities.

However, the application that we designed required a number of trade-offs to be decided in the span of only one week. First, we struggled to choose whether to include more data or to favour an easy-to-use interface. In particular, we wanted to develop functionality to integrate a greater variety of community data but didn’t want the application to become too hard to use. After several hours of discussion, we decided to favour an approach that centered on making open data “easy and ready to use.”

The second trade-off involved the selection of an ESRI JavaScript API. In particular, we had to choose between the ESRI ArcGIS API and ESRI Leaflet for open data integration and visualization. At the beginning, I preferred the ArcGIS API because of its rich functionality. But Tenney pointed out that it was over-qualified for our needs and might delay page loading, which led the team to choose Leaflet.

Finally, we had to decide how to integrate social media. In particular, we needed to decide whether Twitter content should be loaded from a data stream or simply retrieved from the back-end. All of us felt it would be cool to have a real-time Twitter widget on our application’s page, but we didn’t know how to get it to choose the right tweets. For example, a user named Edmonton might say nothing about the City of Edmonton, and our code would have needed to filter those tweets out in real time. Considering the difficulty of developing such a data-filtering AI in one week, we decided to handle Twitter content only on the back-end. To accomplish this, we used Python to harvest and process the data, while ESRI Leaflet handled all the front-end data integration and visualization.
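The back-end filtering problem Xing describes, keeping tweets whose content actually refers to a city rather than tweets from a user who merely shares the city’s name, can be sketched as a simple Python keyword filter. The keyword lists, function names, and sample tweets below are illustrative assumptions, not the team’s actual code:

```python
# A minimal sketch of back-end tweet filtering: match on the tweet body,
# never the username. Keyword lists here are hypothetical examples.

CITY_KEYWORDS = {
    "edmonton": ["edmonton", "yeg"],      # "yeg" is a common Edmonton hashtag
    "vancouver": ["vancouver", "yvr"],
}

def tweet_mentions_city(tweet_text, city):
    """Return True if the tweet text (not the author's name) mentions the city."""
    text = tweet_text.lower()
    return any(keyword in text for keyword in CITY_KEYWORDS[city])

def filter_tweets(tweets, city):
    """Keep only tweets whose content refers to the given city, so a user
    merely named 'Edmonton' is not matched by accident."""
    return [t for t in tweets if tweet_mentions_city(t["text"], city)]

if __name__ == "__main__":
    sample = [
        {"user": "Edmonton", "text": "Loving this weather today!"},
        {"user": "jane", "text": "Road closures in downtown Edmonton this week."},
    ]
    print(filter_tweets(sample, "edmonton"))
```

In a real deployment, the harvested tweets would be cleaned this way on the server before ESRI Leaflet rendered them as a map layer, avoiding any need for real-time filtering in the browser.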

Our application included data on school locations, health facilities, grocery stores, gas stations, green space, cultural facilities, emergency services, census dissemination areas, and Twitter activity, all presented as different map layers. We employed an Agile development method for CODE, meaning we quickly built a prototype, tested it, and then repeated the process, adding functions or re-developing weak code.

In actuality, though, we built three prototypes in the first two days and spent another two days testing, selecting, and re-developing. The Agile method helped keep CODE functional and easy to extend throughout. The only drawback was that local code synchronization became necessary before we pushed to GitHub; if two of us pushed different code at the same time, our repository would get messed up. By late Thursday night, we had nearly finished all the planned coding and had even begun to improve the user experience. The search widget and navigation buttons were added on the last day to make open data easy and ready to use in our CODE application.

We felt that by putting information in the hands of concerned citizens and community leaders, CODE serves as a proof-of-concept for data-driven approaches to building a strong sense of community across cities. CODE also connects people and governments by allowing them to create forums for conversation in specific communities across a city or search social media sites to find other people in their area.

Furthermore, by integrating and visualizing open data at a community scale, CODE demonstrates a new approach for community exploration. In fact, users can search and select different open data for certain communities on one map, and corresponding statistics are shown as pop-ups. In the initial phase, we provided this community exploration service for citizens in Edmonton and Vancouver.

Overall, I felt attending the ECCE App Challenge was a great experience in integrating ESRI web technologies with open data research. It proves that open data can act as a bridge between citizens and cities, and that ESRI products significantly simplify the building of just such a bridge. We believe more applications will be inspired by the ECCE App Challenge and that open data will become part of everyday life. Thanks to ESRI, we got a chance to help shape the next generation of community exploration.

If you have thoughts or questions about this article, get in touch with Jin Xing, Geothink’s Information Technology Specialist, at jin.xing@mail.mcgill.ca.

Spotlight on Recent Publications: Exploring the Hyperbole Behind Crisis Mapping Through Chronic Community Development Issues

By Drew Bush

McGill University Masters Student Ana Brandusescu, lead author on the paper "Confronting the hype: The use of crisis mapping for community development."


In a paper published this month, Geothink researchers critically examined the role that crisis mapping software such as Crowdmap can play when repurposed for community development, based on three Canadian communities in Vancouver and Montreal. They argue that such platforms carry many technological constraints, including an intrinsic short-term orientation that makes them difficult to deploy for the chronic, long-term issues common to community development.

Entitled “Confronting the hype: The use of crisis mapping for community development,” the paper was published in Convergence: The International Journal of Research into New Media Technologies by McGill University master’s student Ana Brandusescu and Associate Professor Renee Sieber, along with Université du Québec à Montréal Professor Sylvie Jochems. Please find the abstract below.

Each of the case studies examined in the paper involved a different set of circumstances. In Montreal, the researchers worked with a community of low-income immigrants in single-family homes who predominantly spoke French. In contrast, one community in Vancouver consisted of young middle-class families living in subsidized student housing while the other was an ethnically diverse low-income community living in rented housing. Both Vancouver communities predominantly spoke English.

“The Vancouver cases had issues resembling crises, for example, immediate rezoning, antidensification, and loss of social housing,” the researchers wrote in the paper. “The Montreal organizers wished to address longer term issues like the recording of community assets.”

In each community, the researchers prepared participants at initial community meetings by using storyboards or comic books to explain the process of mapping. Furthermore, a manual they created helped application managers and community members understand how to manage the application, submit reports (via texts, tweets, Web reports, e-mails, and smartphone messages), geolocate reports, and handle messages that might contain personal identifiers or foul language. In Vancouver, the managers were community activists, while in Montreal the managers were part-time professional community organizers.

Although each community differed in their implementation of the mapping software and program, the findings were striking.

“In this article, we explored the reality behind the hype of crisis mapping and revealed that hype through its repurposing to community development,” they write in their conclusion. “We confronted the zero-cost argument and found numerous technology constraints, which confirmed the challenges of introducing a new technological medium to community development processes.”

“Burns asserted that knowledge politics concerns the role of power in developing a map but the politics also refers to the overall hype to which we so easily succumb,” they add later in their conclusion in reference to a paper by a researcher at the University of Washington, Ryan Burns, entitled Moments of closure in the knowledge politics of digital humanitarianism. “If we acknowledge and then work past the hype then perhaps we will achieve more meaningful and sustainable systems.”

Abstract
Confronting the hype: The use of crisis mapping for community development
This article explores the hyperbole behind crisis mapping as it extends into more long term or ‘chronic’ community development practices. We critically examined developer issues and participant (i.e. community organization) usage within the context of local communities. We repurposed the predominant crisis mapping platform Crowdmap for three cases of community development in Canadian anglophone and francophone communities. Our case studies show mixed results about the actual cost of deployment, the results of disintermediation, and local context with the mapping application. Lastly, we discuss the relationship of hype, temporality, and community development as expressed in our cases.

If you have thoughts or questions about the article, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca.

Spotlight on Recent Publications: Teresa Scassa at the Intersection of Intellectual Property Rights and Municipal Transit Data

By Drew Bush


Teresa Scassa is Canada Research Chair in Information Law at the University of Ottawa.

This story was originally reported on Teresa Scassa’s personal blog, which you can find here.

In a paper just published in the Fordham Urban Law Journal, Geothink researcher Teresa Scassa argues that the actual laws governing intellectual property (IP) rights are often surprisingly irrelevant in disputes over rights to municipal transit data. Instead, she finds that being in a position to make a claim to IP rights is often more important than actually having a good claim.

“How people decide to interact with each other is more important than what their precise legal rights might be,” Scassa, the Canada Research Chair in Information Law at the University of Ottawa, wrote in an e-mail to Geothink.ca. “Often, to understand the precise boundaries of those rights it is necessary to litigate, and one or both parties may lack the resources to go to court. So, in those circumstances, parties may reach an understanding of how they will set the boundaries of their relationships.”

Her paper, entitled Public Transit Data Through an Intellectual Property Lens: Lessons About Open Data, examines some of the challenges presented by the transition from ‘closed’ to open data within the municipal context. She completed the paper as part of a Geothink project examining open data in a concrete context that’s particular to municipalities.

“In the municipal transit data context, there was generally an imbalance of resources between developers and municipalities, and there was little desire on either part to go to court,” she added. “Nevertheless, in the early days, municipal transit authorities asserted their IP rights using cease and desist letters. This assertion of IP rights was met with arguments about the need for open data, and eventually compromises were reached around open data that shifted over time, and varied from one municipality to another.”

In the paper, she examines how these legal developments have affected the use of real-time transit data by developers seeking to build digital applications and by corporations hoping to add value to the products and services they offer. In particular, the paper covers three types of data: (1) route maps; (2) static data, such as bus timetables that change only seasonally; and (3) real-time GPS data generated by units installed on transit vehicles.

A number of municipalities exerted their IP rights over such data because of concerns that ranged from ensuring its quality and authenticity to preserving the ability to make data available on a cost-recovery basis.

“The emerging open data movement shifted some of these concerns and created a new set of expectations and practices around open municipal transit data,” she wrote in her e-mail. “As data become more complex (with the advent of real-time GPS data, for example) the IP issues shifted and changed again, raising new questions about open data in this context. This is where the next phase of my research will take me.”

To find out more about Teresa Scassa’s work, visit her personal blog here or follow her on Twitter @teresascassa. For more on IP, check out another of her recent papers (written with University of Ottawa doctoral student Haewon Chung) that analyzes various types of volunteer citizen science activities to determine whether they raise legal questions about IP ownership.

Find a link to the article along with its abstract below.

Public Transit Data Through an Intellectual Property Lens: Lessons About Open Data

This paper examines some of the challenges presented by the transition from ‘closed’ to open data within the municipal context, using municipal transit data as a case study. The particular lens through which this paper examines these challenges is intellectual property law. In a ‘closed data’ system, intellectual property law is an important means by which legal control over data is asserted by governments and their agencies. In an ‘open data’ context, the freedom to use and distribute content is a freedom from IP constraints. The evolution of approaches to open municipal transit data offers some interesting examples of the role played by intellectual property at every stage in the evolution of open municipal transit data, and it highlights not just the relationship between municipalities and their residents, but also the complex relationships between municipalities, residents, and private sector service providers.

If you have thoughts or questions about the article, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca.