Category Archives: Geothink News

GIS on Campus: Join Claus Rinner for GIS Day at Ryerson

By Naomi Bloch


This Wednesday, November 18, marks the 16th annual GIS Day. Throughout the week, Geothink will present a series of posts looking at some of the ways in which our collaborators, partners, and friends around the world are critically examining and using GIS as a tool for civic engagement and understanding.
The community snapshots presented this week highlight diverse perspectives and uses for GIS. 

If you’re looking for a way to introduce friends to the wide-ranging sphere of GIS, look no further than Toronto’s Ryerson University campus on Wednesday.

Geothink’s Claus Rinner, along with GIS and Map Librarian Dan Jakubek, has a full afternoon of events scheduled for GIS Day. They’ve lined up three keynote presentations, each exploring a very different GIS application: Senior Landscape Ecologist Dr. Namrata Shrestha will discuss her work with the Toronto & Region Conservation Authority; Andrew Lyszkiewicz from the City of Toronto’s Information & Technology Division brings in the municipal GIS perspective; and the Toronto Star’s Matthew Cole and William Davis are on hand to cover the growing role of GIS, mapping, open data, and data analysis in the media.

Apart from the keynotes, there will be a poster session, geovisualization project displays, and several practical demonstrations of GIS and geoweb tools in action. The Neptis Foundation, a Geothink partner, is one of the participating organizations. Neptis’s Adrien Friesen says that he and colleague Vishan Guyadeen will be demonstrating their soon-to-be-launched geoweb platform, “an integrative web mapping tool for the Greater Golden Horseshoe, created to help residents, researchers and decision makers better understand what shapes our urban and rural environments. It allows users to select different spatial layers that they can overlay and view different infrastructure, political boundaries, and protected areas (among many other things), to visualize the region in which they live.”

A full itinerary of the afternoon’s events can be found on the Geospatial Map & Data Centre website. While you’re on campus, you might want to check out the Geospatial Map & Data Centre itself. Ryerson Library’s communal lab is a dedicated space designed to support collaborative work with GIS, data, and related geospatial and statistical software packages.

Date: Wednesday, November 18, 2015
Time: 1:00 pm–5:00 pm
Location: Library Building, LIB-489, 4th Floor, 350 Victoria Street

For more of Geothink’s GIS Day coverage, see:

If you have thoughts or questions about this story, get in touch with Naomi Bloch, Geothink’s digital journalist, at naomi.bloch2@gmail.com.

 

GIS in the City: Toronto on the Map

Map of Toronto’s school nutrition program distribution.

By Naomi Bloch


This Wednesday, November 18, marks the 16th annual GIS Day. Throughout the week, Geothink will present a series of posts looking at some of the ways in which our collaborators, partners, and friends around the world are critically examining and using GIS as a tool for civic engagement and understanding.
The community snapshots presented this week highlight diverse perspectives and uses for GIS. 

If you make your way to Toronto’s City Hall on Wednesday, you’ll discover that the City is displaying a slideshow of some pretty interesting maps, just above the 3D model of the downtown core. The big-screen projection, set up for Geography Awareness Week and GIS Day, showcases some of the ways that Toronto’s municipal divisions are using GIS to tackle urban issues.

These same projects can also be perused at your leisure via a new City of Toronto GIS Day landing page. The City of Toronto is one of Geothink’s municipal partners.

City of Toronto’s Ventilation Index map. All images © City of Toronto 1998-2015. Used with permission.

The breadth of projects may surprise Toronto residents. Take the Environment and Energy Division’s (EED) Ventilation Index. The EED is mapping the city to get a full spatial understanding of areas known as “urban canyons,” narrow open spaces confined between tall buildings that trap ground-level pollution — a problem most noticeable to pedestrians in the city. The lower the ventilation index, the poorer the air flow and the greater the level of pollution. The Ventilation Index researchers’ goal is to identify “how the City’s changing building profile may have growing adverse effects at ground level and how to mitigate and adapt to the rapidly growing City.”

The City’s uses of GIS extend beyond urban planning or trying to understand how urban design impacts the environment. The Department of Public Health, for example, is employing GIS to better understand the distribution and funding of school nutrition programs. In 2014, 160,000 Toronto students participated in breakfast, snack and lunch programs, provided in schools and community sites throughout the city. Public Health has mapped the locations where student nutrition programs were in effect in 2014, with each site categorized based on the funding source behind the program. Incorporating census data such as the Low Income Measure, they’ve identified where meal programs are located relative to the City’s designated Neighbourhood Improvement Areas. This spatial mapping of school nutrition programs helps the City to understand which communities remain under-served and where additional funding should be allocated.
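Public Health’s approach is, at heart, a spatial-join workflow: point locations of program sites, categorized by funder, overlaid on census geography carrying the Low Income Measure and on Neighbourhood Improvement Area boundaries. The sketch below shows roughly how such an overlay could be assembled with open tools such as GeoPandas; the file names, column names, and coordinate system are illustrative assumptions, not the City’s actual data or workflow.

```python
# A minimal sketch (not the City's actual workflow) of the overlay described
# above. File names and column names are hypothetical placeholders.
import geopandas as gpd

sites = gpd.read_file("nutrition_program_sites.geojson")   # site_id, funder, geometry (points)
census = gpd.read_file("census_areas.geojson")             # geo_id, lim_rate, geometry (polygons)
nias = gpd.read_file("neighbourhood_improvement_areas.geojson")

# Work in one projected CRS so spatial predicates behave sensibly (UTM 17N covers Toronto).
census = census.to_crs(epsg=32617)
sites = sites.to_crs(census.crs)
nias = nias.to_crs(census.crs)

# Attach the Low Income Measure of the surrounding census area to each program site.
sites = gpd.sjoin(sites, census[["geo_id", "lim_rate", "geometry"]],
                  how="left", predicate="within")

# Flag sites that fall inside a Neighbourhood Improvement Area.
sites["in_nia"] = sites.geometry.within(nias.unary_union)

# Summarize sites per funding source, inside vs. outside NIAs.
print(sites.groupby(["funder", "in_nia"]).size().unstack(fill_value=0))
```

From a table like this, gaps become visible: funders and neighbourhoods with few or no program sites despite high Low Income Measure rates.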

Want to know more about how the City of Toronto is using GIS? Head to Metro Hall on Friday, Nov. 20, anytime between 8:00 am and 3:00 pm. A few staff members from the City’s Geospatial Competency Centre will be on hand to answer questions.

For more of Geothink’s GIS Day coverage, see:

If you have thoughts or questions about this story, get in touch with Naomi Bloch, Geothink’s digital journalist, at naomi.bloch2@gmail.com.

 

GIS in the Classroom: Geography and the Law

By Naomi Bloch


This Wednesday, November 18, marks the 16th annual GIS Day. Throughout the week, Geothink will present a series of posts looking at some of the ways in which our collaborators, partners, and friends around the world are critically examining and using GIS as a tool for civic engagement and understanding.
The community snapshots presented this week highlight diverse perspectives and uses for GIS. 

Tenille Brown, Ph.D. candidate in the University of Ottawa’s Faculty of Law

In the winter 2016 term, Geothink’s Tenille Brown, a Ph.D. candidate in the University of Ottawa’s Faculty of Law, will be teaching a new course called Law and Geography. The seminar course will be offered as a first-year elective option for J.D. students. “It’s really exciting because it will be the first law and geography course in a Faculty of Law in Canada that I am aware of,” said Brown.

The intention of the course is to introduce new law students to the emerging field of legal geography, which focuses on spatial and place-based aspects of law and legal regulation. The course description highlights several focus areas, including public and private spaces; property and the city; critical perspectives of identity, racism and the law; gender, property and the law; indigenous peoples and the environment; and globalization. “There’s a wide variety of topics,” said Brown, “and within that I have a couple of classes which will look at issues of GIS and a lot of the themes of Geothink in relation to legal geography scholarship and in relation to the law.”

Brown notes that GIS is addressed in the legal literature to some extent, but such discussion is in its nascent stages. For example, the field of technology law deals with liability issues in relation to GIS, and issues such as copyright and privacy. “And there’s a little bit of GIS analysis in relation to understanding crime, and criminality,” said Brown. “That’s a big area of research, but I think there are many, many, many GIS narratives which are not captured at all.”

All of these GIS-oriented legal issues will play a role in her course; however, she’s also hoping to draw in some students who have previous practical experience with GIS technologies. “If there are students who have a particular interest in GIS or have skills in GIS, and they’re willing, then we can explore not just legal liability in relation to GIS but also, how can we use GIS to help the functioning of the legal system? So really opening it up for those skills to be brought into the classroom.”

“I’m interested in knowing how information about a place, which is maybe more than property-related, can influence how we regulate or understand a particular area of a city, for example,” Brown said. “How can we bring in different information about a city that is not captured by a property title deed, or a traditional survey that we might have? We see a lot of non-traditional information collection right now. That is, it’s non-traditional from a legal perspective — information about how people use a place. Typically the law doesn’t care about that. Typically the law just wants to know who has the title deed, and that’s it.”

Brown offers the example of First Nations groups in Canada, who are currently using GIS and GIS technologies to collect oral histories and map out their histories spatially. “There’s a big push from indigenous communities, and a willingness and a desire to engage with GIS technologies to capture these different narratives,” Brown said. “And they’re wanting to use it to support land claims. That’s their whole aim.

“So it’s important to figure out how modern information can be incorporated into a legal system which relies on historical treaties,” Brown explained. “There’s a lot of legal questions about using that information and the strength of that kind of information from an evidentiary perspective. The law has a very non-GIS approach — a non-tech approach — to adjudication. So I think one of the really important questions is, how can we get this modern GIS counter-narrative and make sure that it’s solid as evidence that is effective for the legal system?”

For Brown, encouraging students with a GIS or geography background to consider how their knowledge can contribute to the legal process is just one motivation for her course. “They’re first-year law students,” Brown said,  “so they’re just beginning to get to grips with what takes place in the Faculty of Law. They’re in shock a little bit, at this point. With this class, I’m really hoping to open it up for students that already have an undergraduate degree in something spatial-related. If there’s anyone who’s done work with GIS, that will definitely enrich the classes.”

Do you have questions about Tenille’s course or research? Contact her on Twitter at: @TenilleEBrown 
Tenille Brown is a PhD candidate in the Faculty of Law at the University of Ottawa. She is a Geothink student member, and a member of the university’s Human Rights Research and Education Centre. Her research is in the areas of legal geography, including property, spatial and citizen engagement in the Ottawa context.


For more of Geothink’s GIS Day coverage, see:

If you have thoughts or questions about this article, get in touch with Naomi Bloch, Geothink’s digital journalist, at naomi.bloch2@gmail.com.

Moving Forward with Canadian Census Data

By Naomi Bloch


Choropleth maps of National Household Survey global non-response data at the dissemination-area level, courtesy of Scott Bell. Global non-response rates above 50% resulted in suppression of data for that spatial unit. All maps are classified using a quantile classification scheme.


As we move forward (and backward) with the 2016 return of Canada’s long-form census, questions remain for everyone who uses Statistics Canada’s key socio-economic data. Researchers, local government agencies, community organizations, and industry will still need to use data collected via the 2011 National Household Survey (NHS) and understand how to reconcile that information with long-form census data.

Concerns regarding the reliability of NHS data stem from the lower response rates that resulted from the non-mandatory nature of the 2011 survey. The overall response rate for the survey decreased from 94 percent in 2006 to 69 percent in 2011. Media attention has centred on the fact that Statistics Canada chose not to release survey data for 25 percent of all census subdivisions because response rates for those spatial units were too low. A key question is whether the regions for which we have no reliable data share certain socio-economic characteristics — and if so, how this might impact service provision.

Geothink co-applicant researcher Scott Bell, a professor of Geography and Planning at the University of Saskatchewan, has been studying and mapping the spatial patterns of the National Household Survey’s global non-response rates. His work examines various geographic levels and considers response rate patterns relative to several socio-economic variables. Bell found that across the 15 cities he studied, areas with similar response rates tend to share socio-economic characteristics.
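The mapping conventions in the figure above follow a simple recipe: suppress dissemination areas whose global non-response rate (GNR) exceeds 50%, as Statistics Canada did, and classify the remaining areas into quantiles. The sketch below illustrates that recipe with GeoPandas; it assumes a hypothetical input file and column names, and is not a reproduction of Bell’s maps.

```python
# Illustrative sketch of the suppression and quantile-classification
# conventions described in the figure caption. File and column names are
# assumptions, not Bell's actual data.
import geopandas as gpd
import pandas as pd
import matplotlib.pyplot as plt

das = gpd.read_file("dissemination_areas_nhs.geojson")   # da_id, gnr_pct, geometry

# Mirror Statistics Canada's suppression rule: GNR > 50% means the area's
# NHS data were withheld.
das["suppressed"] = das["gnr_pct"] > 50
plotted = das[~das["suppressed"]].copy()

# Quantile classification: roughly equal counts of areas in each class.
plotted["gnr_class"] = pd.qcut(plotted["gnr_pct"], q=5,
                               labels=["lowest", "low", "middle", "high", "highest"])

fig, ax = plt.subplots(figsize=(8, 8))
das[das["suppressed"]].plot(ax=ax, color="lightgrey")   # suppressed areas shown in grey
plotted.plot(ax=ax, column="gnr_class", categorical=True, legend=True, cmap="OrRd")
ax.set_title("NHS global non-response rate (quantile classes)")
ax.set_axis_off()
plt.show()
```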

In this video interview, Bell discusses his research and its implications.

For more from Scott Bell, see also: The Long-term Impacts of the Short-Lived National Household Survey

If you have thoughts or questions about this video interview, get in touch with Naomi Bloch, Geothink’s digital journalist, at naomi.bloch2@gmail.com.

The Long-term Impacts of the Short-lived National Household Survey

Tweet

By Naomi Bloch

On November 5, Navdeep Bains, Canada’s new Minister of Innovation, Science and Economic Development (now that’s a mouthful!) confirmed the rumours that the country’s mandatory long-form census will be reinstated in 2016.

But what are the long-term consequences of the interruption in mandatory data collection caused by 2011’s National Household Survey (NHS)? How significant is this short-lived census change likely to be?

Geothink co-applicant researcher Scott Bell, a professor of Geography and Planning at the University of Saskatchewan, has been studying and mapping the spatial patterns of the voluntary National Household Survey data, comparing global non-response rates in metropolitan and non-metropolitan areas across the country. After the official announcement, Bell shared a few preliminary thoughts based on his research.

Scott Bell is a professor in the Department of Geography and Planning at the University of Saskatchewan.

Geothink: Now that the mandatory long-form census has been re-instated, is there anything that researchers or others who rely on location-based specifics from census data need to keep in mind?

Scott Bell: In my own research I have been relying on 2006 data for much longer than I would have if the 2011 survey had been the long form of the census. The NHS misrepresents different parts (and types of parts) of the country. In my analysis of 15 Canadian cities, there were lower response rates (measured by non-response) in places with low income, Aboriginal populations, new immigrants, and lower rates of education. This is quite troubling since the only solution Stats Canada had at their disposal was oversampling in such areas, which might exacerbate the bias.

Geothink: Did you find that your own recent research was impacted by the 2011 data, and are there likely to be any long-term implications for researchers, given that just one survey period was affected?

Scott Bell: Yes, I was compelled to use long-form data from 2006. It is a relief that we will have a return of this data for 2016. I have always appreciated Canada’s five-year census cycle and a 10-year wait is going to be OK, this once. But there will be a persistent problem trying to understand our society between 2011 and 2016 that won’t be true of another five-year period. Our understanding of economics, household mobility, finances, and structure, immigration, education, etc. for the period from 2011 to 2016 is diminished.

Geothink: Are there any important considerations to keep in mind, for those integrating data from 2011 and other periods?

Scott Bell: In work I hope to publish in the next six months, patterns of response (actually non-response) and what social and economic variables predict this non-response will be elucidated. The next step might be the development of tools to adjust NHS values in order to make the data collected more reliable. The most important step in this direction will be the collection of the long form in 2016; that data will be useful in establishing estimates of what 2011 values are valid and perhaps allow for the setting of “correction factors” for egregious rates of non-response.
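Bell is describing future work, so no such adjustment tool exists yet; the snippet below is only a toy illustration of the correction-factor idea under invented numbers, with the 2016 long-form census standing in as a benchmark for a 2011 NHS estimate.

```python
# Toy illustration of a per-area "correction factor" of the kind Bell
# speculates about. All values are invented; a real adjustment would also
# need to separate genuine change between 2011 and 2016 from non-response bias.
import pandas as pd

nhs_2011 = pd.DataFrame({
    "geo_id": ["A", "B", "C"],
    "median_income_nhs": [58000, 41000, 73000],
    "gnr_pct": [22.0, 48.0, 31.0],        # global non-response rate in 2011
})
census_2016 = pd.DataFrame({
    "geo_id": ["A", "B", "C"],
    "median_income_census": [60000, 47000, 74000],
})

merged = nhs_2011.merge(census_2016, on="geo_id")

# Ratio of the mandatory-census benchmark to the voluntary-survey estimate.
merged["correction_factor"] = merged["median_income_census"] / merged["median_income_nhs"]

# One could then ask whether areas with higher non-response need larger corrections.
print(merged[["geo_id", "gnr_pct", "correction_factor"]])
```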

Stay tuned for more detailed insights from Scott Bell on location-specific considerations of the National Household Survey data, coming soon to Geothink.ca.

If you have thoughts or questions about this article, get in touch with Naomi Bloch, Geothink’s digital journalist, at naomi.bloch2@gmail.com.

Journalism: Storytelling in the Geodata Age

By Naomi Bloch

The rise of more accessible geospatial web tools, along with expanding sources of open data, has fostered a potent—if somewhat techno-utopian—civic vision. For those immersed in understanding this new digital landscape, one question often surfaces: who’s truly putting these resources to use?

The most reliable answer is perhaps an obvious one. “Journalists are making huge use of mapping and geodata for storytelling, for the visualization of stories, and for investigative reporting purposes,” said April Lindgren, an associate professor at Ryerson University’s School of Journalism and founding director of the Ryerson Journalism Research Centre, a Geothink partner organization.

As a scholar, Lindgren’s own research employs data mapping techniques to examine the geography of news coverage and the role of Canadian media in society. “Maps have actually been quite a powerful tool for us to explore patterns of local news and understand how it works. It opened up a whole new way of getting at and understanding the data because we were able to visualize it.

“Before that, it was the old problem of columns and reams of numbers,” Lindgren said. “But being able to map it allowed us to show geographically, yes, most of the news coverage is focused on downtown Toronto. So why is that? And what are the implications of not doing much coverage in other areas of the city? And furthermore, we mapped the types of topics. So what does it mean when most of the news that they publish about certain areas is crime coverage? What does that do in terms of the geographic stereotyping?”

Computer-assisted reporting revisited

Lindgren notes that the use of mapping and data analysis for journalistic purposes is not a new phenomenon. More than twenty years ago, Miami Herald research editor Steve Doig’s investigative coverage of Hurricane Andrew’s aftermath in Florida helped the paper win a 1993 Pulitzer Prize. The year prior, Doig and his colleagues spent several intensive months processing and evaluating two data sets—one that helped to map out property damage caused by the hurricane and another documenting wind speeds at different locations and times throughout the storm. “They noticed from using mapping that the damage was much more extensive in certain areas than in others, and then they started trying to figure out why that was, because weather-wise it was the same storm,” Lindgren explained.

“What Went Wrong > Miami Herald, December 20, 1992 > Page 1” (originally published Dec. 20, 1992). Flickr photo by Daniel X. O’Neil, licensed under CC BY 2.0

Further investigation revealed that several different developers had been responsible for real estate construction in different regions. “And it led them to a conclusion and a very powerful piece of journalism showing that it had to do with the building standards of the different developers,” said Lindgren. “So that was one of the early uses of mapping and data journalism, showing what a useful tool it could be.”
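The analytical move here is simple to state: join damage records to wind observations by area, then look for places where damage is out of proportion to the wind. The sketch below illustrates that comparison with pandas; the neighbourhood labels and figures are invented for illustration and have no relation to the Herald’s actual data.

```python
# Simplified illustration of the damage-versus-wind comparison described
# above. All neighbourhood names and values are invented.
import pandas as pd

damage = pd.DataFrame({
    "area": ["Neighbourhood A", "Neighbourhood B", "Neighbourhood C", "Neighbourhood D"],
    "pct_homes_destroyed": [62, 18, 55, 24],
})
wind = pd.DataFrame({
    "area": ["Neighbourhood A", "Neighbourhood B", "Neighbourhood C", "Neighbourhood D"],
    "peak_gust_mph": [130, 135, 145, 140],
})

merged = damage.merge(wind, on="area")

# Similar winds but very different damage points to something other than
# the storm itself (in the Herald's case, construction standards).
merged["damage_per_mph"] = merged["pct_homes_destroyed"] / merged["peak_gust_mph"]
print(merged.sort_values("damage_per_mph", ascending=False))
```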

As researchers raise questions about the skills and motivations that enable citizen engagement with open data and geospatial technologies, journalism schools are increasingly recognizing the need to integrate a formal understanding of data journalism into the curriculum.

At the 2014 Geothink Annual General Meeting, Lindgren met a fellow researcher with complementary interests—Marcy Burchfield, executive director of the Toronto-based Neptis Foundation. The aim of Neptis has been to apply the unique capabilities of mapping and spatial analysis to help decision makers and the public understand regional issues in the Greater Toronto Area. The Geothink encounter led to the development of a Neptis-led geodata workshop for senior-level students enrolled in Ryerson’s journalism school, exposing students to some statistics basics as well as the various challenges of working with spatial data to develop meaningful stories.

“Getting the data into a usable form, I think, is probably the biggest challenge technically for journalists,” said Lindgren. “Although the skills are rapidly increasing and we’re training our students to do that.”

At Ryerson, undergraduates are required to take an introductory digital journalism course that critically engages with social media and citizen journalism along with new forms of multimedia and alternative storytelling methods. A separate “visualizing facts” elective course aims to provide hands-on experience with various data visualization techniques including mapping, while reinforcing numeracy skills (something that, historically, journalists have not been known for).

Data’s fit for purpose?

CBC News’s crowdsourced, interactive “Pledge to Vote” map, part of their 2015 Canada Votes coverage.

In recent years Canadian data journalists have garnered international attention both for their creative uses of geodata and their involvement in the push for open access to government information. “One of the big problems is the availability of data,” Lindgren said. “What’s available? How good is it? How hard do you have to fight for it? Is it really available through an open data source or do you have to go through Freedom of Information to get it?”

While media outlets are increasingly exploring ways to engage the public in volunteering geodata for crowdsourced content, the data sets that journalists tend to be interested in—ideally, data that can support rich, informative stories relevant to the public interest—are not typically collected with the journalist in mind. In particular, government data sources have often been generated to support internal administrative needs, not to address transparency and accountability concerns per se. Data input decisions may not be documented, and agencies may “silently” post-process the information before distributing it to journalists or the greater public. Learning how to clean up inconsistent, non-standardized data developed for a very different audience is therefore a particularly important skill for journalists to acquire. Only then can a journalist build an understanding of the data’s patterns and the stories they can support.
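As a concrete, if hypothetical, example of that clean-up step, the snippet below normalizes a small extract of complaint records with inconsistent labels and date formats; the column names and values are invented, not drawn from any real municipal release.

```python
# A minimal example of cleaning inconsistent, non-standardized records.
# The data frame is invented for illustration.
import pandas as pd

raw = pd.DataFrame({
    "ward": [" Ward 3", "ward 03", "WARD 3", "Ward 12"],
    "complaint_type": ["No Heat", "no heat ", "NO HEAT", "Pests"],
    "date_received": ["2015-03-04", "04/03/2015", "March 4, 2015", "2015-06-20"],
})

clean = raw.copy()

# Normalize categorical text: trim whitespace, unify case, strip leading zeros.
clean["ward"] = (clean["ward"].str.strip().str.title()
                 .str.replace(r"Ward 0+", "Ward ", regex=True))
clean["complaint_type"] = clean["complaint_type"].str.strip().str.lower()

# Parse dates; values that cannot be parsed consistently become NaT,
# flagging rows that need manual review.
clean["date_received"] = pd.to_datetime(clean["date_received"], errors="coerce")

print(clean)
```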

“You’re only as good as your data,” Lindgren emphasized. “In some ways the act of journalism allows you to test the data and see how good it is. Because the data may be telling you one thing, but then when you go out on the ground and you start interviewing and looking around you may find that what you’re seeing and hearing doesn’t seem to match what the data is telling you.

“So right away, as a journalist you’re going to be suspicious of that. And there are two places where this could be wrong. Either you’re talking to the wrong people or you’re not talking to a broad enough range of people—or there might be something wrong with the data.”

Verifying data accuracy is a time-honoured tradition

Lindgren shared the example of a colleague who was investigating the issue of slum landlords. The reporter asked the municipality to provide data on property standards complaints. Upon receiving and eventually mapping the data, the reporter and his colleagues made a surprising discovery. “They noticed that there was a section of the city that didn’t have any complaints. They thought that was odd, because they knew that there were a lot of rental areas and low-income areas there, with people living in somewhat vulnerable housing situations.”

Ultimately, the dissonance between the story on the ground and the story in the data led the reporter to go back to the city seeking further verification, and the nature of the problem soon revealed itself. It seems that a summer student had been in charge of aggregating and disseminating the data to the journalists when the information was requested, and that student had overlooked one section of the city.

While this particular story reflects human error during the communication phase rather than the data collection phase, Lindgren points out that the strong journalistic traditions of seeking verification and being suspicious of information sources puts the media in a unique position to evaluate data’s quality. “Verification is a fundamental element of journalism. That’s what we do that’s different from anybody who is just commenting out there online. The main issue is: is it verifiable, and what’s the public interest? That’s the starting point.”

Where public and private interests intersect

What constitutes “public interest” is a conversation that still needs to happen. The push for open data, and the fact that personal information is increasingly accessible online, have led parties both within and beyond government to raise concerns about how to strike the balance between privacy and transparency—and what the right balance may be. Data sets often contain personal or identifying information, and cleansing the data of that information is not straightforward. Even when data appear anonymized on the surface, there are ever-increasing opportunities to combine and process seemingly unrelated data sets in ways that can identify individuals and compromise personal information. As Geothink co-applicant researcher Teresa Scassa has addressed more than once in her work, this is not a theoretical problem but a reality that is already occurring.

Lindgren, however, said she does not see data journalism as giving rise to new types of ethical concerns for the media. “Obviously, a balance has to be struck. But the reality is that oftentimes the data is very generalized. It really depends on what the issue is and what the information is.

“The whole privacy issue is really a red flag, a lot of times, for journalists, because it can be used by governments as a pretext for not releasing information that governments just don’t want the public to know. The two reasons they don’t release information is privacy and violating commercial interests, and then the third reason is political consideration, but they can’t couch it in those terms.”

In terms of how journalists themselves strike that balance, Lindgren said this must be assessed on a case-by-case basis. “Basically, our job is invading people’s space, quite often. So we have to—and we do—make those judgment calls every day. The data is just another layer of that, or another area where we’d have to think about it and have those discussions.

“What it comes down to is you’re weighing, what’s the public interest in this information? There’s no hard and fast rule. It depends on what the information is.”

If you have any questions for April, reach her on Twitter here: @aprilatryerson

If you have thoughts or questions about this article, get in touch with Naomi Bloch, Geothink’s digital journalist, at naomi.bloch2@gmail.com.

Geothoughts 7: Unpacking the Current and Future Value of Open Civic Data

Geothink researcher Peter Johnson and his students have been working with government partners across the country to examine the state of civic open data projects in Canada.

By Naomi Bloch

Peter Johnson, assistant professor in the University of Waterloo’s Department of Geography and Environmental Management, was recently awarded Ontario’s Young Researcher Award.

Geothink co-applicant researcher Peter A. Johnson is an assistant professor of Geography and Environmental Management at the University of Waterloo. Johnson and his students have been working with Geothink government partners across the country to examine the state of civic open data projects in Canada. In our latest podcast, he discusses how the seemingly desirable ethos of open data may nonetheless hamper our understanding of how end users are interacting with government products.

In their July article published in Government Information Quarterly, Johnson and Geothink head Renee Sieber discuss what they see as the dominant models—and related challenges—of civic open data today. The authors suggest that these models may carry potentially conflicting motivations: governments can distribute data and leave it to users to discover and determine its value; they may aim to track civic issues in ways that are cost-efficient; or they may try to support market innovation via data provision and the promotion of crowdsourced contributions. On the other hand, open data efforts also have the potential to enable productive and empowering two-way civic interactions when motivated by non-economic imperatives.

What future directions will government data provision take? That may depend a lot on the choices that government agencies—and end users—make today.

 

If you have thoughts or questions about this podcast, get in touch with Naomi Bloch, Geothink’s digital journalist, at naomi.bloch2@gmail.com.

Reference
Sieber, R. E., & Johnson, P. A. (2015). Civic open data at a crossroads: Dominant models and current challenges. Government Information Quarterly, 32(3), 308–315. doi:10.1016/j.giq.2015.05.003

Abstract
As open data becomes more widely provided by government, it is important to ask questions about the future possibilities and forms that government open data may take. We present four models of open data as they relate to changing relations between citizens and government. These models include: a status quo ‘data over the wall’ form of government data publishing; a form of ‘code exchange’, with government acting as an open data activist; open data as a civic issue tracker; and participatory open data. These models represent multiple end points that can be currently viewed from the unfolding landscape of government open data. We position open data at a crossroads, with significant concerns of the conflicting motivations driving open data, the shifting role of government as a service provider, and the fragile nature of open data within the government space. We emphasize that the future of open data will be driven by the negotiation of the ethical-economic tension that exists between provisioning governments, citizens, and private sector data users.