Category Archives: Geothink News

Behind the Scenes with Geothink Partner Yvonne Chen: Edmonton’s Open City Initiative

By Drew Bush

Sometime early next week, Edmonton’s City Council will vote to endorse an Open City Initiative that will help cement the city’s status as a smart city alongside cities like Stockholm, Seattle and Vienna. This follows close on the heels of a 2010 decision by city leaders to be the first to launch an open data catalogue and the 2011 awarding of a $400,000 IBM Smart Cities Challenge grant.

Yvonne Chen, Strategic Planner at City of Edmonton.

“Edmonton is aspiring to fulfill its role as a permanent global city, which means innovative, inclusive and engaged government,” said Yvonne Chen, a strategic planner for the city who has helped orchestrate the Open City Initiative. “So Open City acts as the umbrella that encompasses all the innovative open government work within the city of Edmonton.”

This work has included using advanced analysis of open data streams to enhance crime enforcement and prevention, an “open lab” to provide new products that improve citizen interactions with government, and interactive neighbourhood maps that will help Edmontonians locate and examine waste disposal services, recreational centres, transit information, and capital projects.

For Chen, last week’s release of the Open City Initiative represents the culmination of years of work.

“My role throughout this entire release was I was helping with the public consultation sessions, I was analyzing information, and I was writing the Open City initiative documents as well as a lot of the policy itself,” she said.

In fact, the document outlines city policy, action plans for specific initiatives, environmental scans (or reviews), and results from public consultations on city initiatives. More than 1,800 Edmontonians commented on the Initiative last October before it was revised and updated for the Council.

“The City of Edmonton’s Open City Initiative is a municipal perspective of the broader open government philosophy,” Chen adds. “It guides the development of innovative solutions in the effort to connect Edmontonians to city information, programs, services and any engagement opportunities.”

Like many provincial and federal open city policies, the document focuses on making Edmonton’s government more transparent, accountable and accessible. What sets Edmonton apart, Chen said, is a focus on including citizens in the design and delivery of city programs and services through deliberate consultations, presentations, public events and online citizen panels. In fact, more than 2,200 citizens sit on the online panel alone, fielding questions from city officials as they design the Initiative’s infrastructure.

So what will it mean to live in one of North America’s smart cities? Edmonton is already beginning to provide free public Wi-Fi around the city, developing a 311 application (not yet released) for two-way communication about city services, and working to fully integrate its electronic services across city departments.

“So one of the projects which has been reviewed very, very popularly is we have opened public Wi-Fi on the LRT stations,” Chen said of a project that includes plans for expanding the service to all trains and tunnels. “A lot of commuters are utilizing the service and utilizing the free Wi-Fi provided by the city while waiting for their trains.”

What do you think about Edmonton’s Open City initiative? Let us know on Twitter @geothinkca.

If you have thoughts or questions about this article, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca.

Studying Public Transport Behavior By Accident: Lessons Learned from A Graduate Class in Computer Science with Scott Bell

Saskatchewan’s bus lockout lasted one month (Photo courtesy of wn.com).

By Drew Bush 

Sometimes the best research grows out of collaboration and study within the confines of a classroom. For Scott Bell, an associate professor of Geography and Planning at the University of Saskatchewan, a graduate class in computer science provided just such an opportunity.

The past few years, Bell has collaborated with a colleague in Saskatchewan’s Department of Computer Science, Kevin Stanley, to teach a research-based class that gives students hands-on experience using an Android application for smartphones to measure human behavior at 2-minute intervals. Specifically, the application uses a phone’s accelerometer, GPS, camera, Bluetooth and other sensors to monitor participants’ movements and behaviors.

Last fall, students in the class wanted to examine health beliefs and constructed a survey to administer to participants before the phones were given out. Then, in an event unrelated to the class’s planned content, the City of Saskatoon locked out its bus drivers two weeks before class started, halting transit service for a month.

“We didn’t know what was going to happen during the shutdown,” Bell said. “But during our design stage we included in our survey a series of questions about how the participants get to work, and all of our participants were students at the [University of Saskatchewan]… Do they prefer to take public transit? Are they regular transit users? That kind of thing.”

Afterwards, Bell’s class gave these participants the phones and tracked them for a month—with the lockout ending two weeks into this period. Bell and Stanley’s students could then look at how student movement patterns and behaviors changed during and after a time period in which public transport was unavailable.

“It was interesting,” Bell said. “And one of the main findings, was that there wasn’t a change in attendance. So when the strike, when the lockout was on, everybody was still coming to the university at about the same rate as they came after the lockout ended, so when transit was fully available again.”

However, Bell and Stanley’s students noticed that when the lockout ended student trips to and from school actually became quite a bit shorter. They hypothesized that when their participants were forced to find alternate means of transportation they often relied on friends with private vehicles or on their own car.

This new reliance on private vehicles made additional trips, like running errands, possible on the way home, thanks to the flexibility cars allow compared to public transit. While not yet confirmed in this research, Bell calls this type of behavior “trip-chaining,” an idea often mentioned in transportation geography. Once the lockout ended and these participants returned to public transport, such behavior ended.
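As a rough illustration of the comparison the class could run, the sketch below contrasts mean trip durations during and after the lockout. The numbers and variable names here are invented for demonstration only; they are not data from Bell and Stanley’s study.

```python
# Illustrative sketch: compare average commute length during the transit
# lockout (when participants trip-chained in private vehicles) with the
# period after bus service resumed. All values are invented examples.
from statistics import mean

trips_during_lockout = [52, 47, 55, 49, 58]  # minutes per trip (invented)
trips_after_lockout = [34, 41, 38, 45, 36]   # minutes per trip (invented)

def mean_trip_minutes(trips):
    """Average trip duration in minutes for one period."""
    return mean(trips)

during = mean_trip_minutes(trips_during_lockout)
after = mean_trip_minutes(trips_after_lockout)
print(f"during lockout: {during:.1f} min; after: {after:.1f} min")
```

In this toy example, trips during the lockout come out longer, consistent with the trip-chaining hypothesis described above.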

“We did learn in this study that we could use this technique to study transit behavior and we’re thinking a little bit more about that,” Bell said. “About how we can maybe do that with some of the Geothink partners that have more open data policies regarding their transit data and transit use.”

If Bell and Stanley’s students had only examined student attendance rates and arrival times at the University of Saskatchewan, then they might have concluded the lockout had little effect on student behavior. However, by using surveys and smartphones, this technique established how stressful it was for students and, perhaps most importantly, how behaviors actually changed on the ground.

If you have thoughts or questions about this article, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca.

Crosspost: Looking at Crowdsourcing’s Big Picture with Daren Brabham

This post is cross-posted with permission from the personal blog of Daren C. Brabham, Ph.D. Brabham is a Geothink partner at the University of Southern California Annenberg School for Communication and Journalism, where he was the first to publish scholarly research using the word “crowdsourcing.”

by Daren C. Brabham

In this post, I provide an overview of crowdsourcing and a way to think about it practically as a problem solving tool that takes on four different forms. I have been refining this definition and typology of crowdsourcing for several years and in conversation with scholars and practitioners from diverse fields. Plenty of people disagree with my characterization of crowdsourcing and many have offered their own typologies for understanding this process, but I submit that this way of thinking about crowdsourcing as a versatile problem solving model still holds up.

I define crowdsourcing as an online, distributed problem solving and production model that leverages online communities to serve organizational goals. Crowdsourcing blends bottom-up, open innovation concepts with top-down, traditional management structures so that organizations can effectively tap the collective intelligence of online communities for specific purposes. Wikis and open source software production are not considered crowdsourcing because there is no sponsoring organization at the top directing the labor of individuals in the online community. And when an organization outsources work to another person–even if that work is digital or technology-focused–that is not considered crowdsourcing either because there is no open opportunity for others to try their hands at that task.

There are four types of crowdsourcing approaches, based on the kinds of problems they solve:

1. The Knowledge Discovery and Management (KDM) crowdsourcing approach concerns information management problems where the information needed by an organization is located outside the firm, “out there” on the Internet or in daily life. When organizations use a KDM approach to crowdsourcing, they issue a challenge to an online community, which then responds to the challenge by finding and reporting information in a given format back to the organization, for the organization’s benefit. This method is suitable for building collective resources. Many mapping-related activities follow this logic.

2. The Distributed Human Intelligence Tasking (DHIT) crowdsourcing approach concerns information management problems where the organization has the information it needs in-hand but needs that batch of information analyzed or processed by humans. The organization takes the information, decomposes the batch into small “microtasks,” and distributes the tasks to an online community willing to perform the work. This method is ideal for data analysis problems not suitable for efficient processing by computers.

3. The Broadcast Search (BS) crowdsourcing approach concerns ideation problems that require empirically provable solutions. The organization has a problem it needs solved, opens the problem to an online community in the form of a challenge, and the online community submits possible solutions. The correct solution is a novel approach or design that meets the specifications outlined in the challenge. This method is ideal for scientific problem solving.

4. The Peer-Vetted Creative Production (PVCP) crowdsourcing approach concerns ideation problems where the “correct” solutions are matters of taste, market support, or public opinion. The organization has a problem it needs solved and opens the challenge up to an online community. The online community then submits possible solutions and has a method for choosing the best ideas submitted. This way, the online community is engaged both in the creation and selection of solutions to the problem. This method is ideal for aesthetic, design, or policy-making problems.

This handy decision tree below can help an organization figure out what crowdsourcing approach to take. The first question an organization should ask about their problem solving needs is whether their problem is an information management one or one concerned with ideation, or the generation of new ideas. In the information management direction, the next question to consider is if the challenge is to have an online community go out and find information and assemble it in a common resource (KDM) or if the challenge is to use an online community to process an existing batch of information (DHIT). On the ideation side, the question is whether the resulting solution will be objectively true (BS) or the solution will be one that will be supported through opinion or market support (PVCP).
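The branching logic above can be sketched in a few lines of code. This is an illustrative sketch only: the function name is hypothetical and the question wording is paraphrased from the post, not taken from the original decision tree.

```python
# Illustrative sketch of Brabham's crowdsourcing decision tree.
# Function name and argument values are hypothetical, chosen for clarity.

def choose_crowdsourcing_approach(problem_type: str, detail: str) -> str:
    """Return the approach suggested by the decision tree.

    problem_type: "information management" or "ideation"
    detail: for information management -- "find" (community gathers new
            information) or "process" (community analyzes an existing batch);
            for ideation -- "objective" (empirically provable solution) or
            "subjective" (judged by taste, market support, or opinion).
    """
    if problem_type == "information management":
        # Should the community find information, or process what we already have?
        return "KDM" if detail == "find" else "DHIT"
    elif problem_type == "ideation":
        # Will the winning solution be objectively true, or a matter of taste?
        return "BS" if detail == "objective" else "PVCP"
    raise ValueError("problem_type must be 'information management' or 'ideation'")

print(choose_crowdsourcing_approach("information management", "find"))  # KDM
print(choose_crowdsourcing_approach("ideation", "subjective"))          # PVCP
```

The two nested questions mirror the tree: the first split separates information management from ideation, and the second split picks the specific approach within each branch.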


A decision tree for determining appropriate crowdsourcing approaches for different problems. Source: Brabham, D. C., Ribisl, K. M., Kirchner, T. R., & Bernhardt, J. M. (2014). Crowdsourcing applications for public health. American Journal of Preventive Medicine, 46(2), 179-187.

I hope this conception of crowdsourcing is easy to understand and practically useful. Given this outlook, the big question I always like to ask is how we can mobilize online communities to solve our world’s most pressing problems. What new problems can you think of addressing with the power of crowds?

Daren C. Brabham, Ph.D., is an assistant professor in the Annenberg School for Communication & Journalism at the University of Southern California. He was the first to publish scholarly research using the word “crowdsourcing,” and his work focuses on translating the crowdsourcing model into new applications to serve the public good. He is the author of the books Crowdsourcing (MIT Press, 2013) and Crowdsourcing in the Public Sector (Georgetown University Press, 2015).

Geothink Video Interview 1: Teresa Scassa, University of Ottawa

By Drew Bush

This Geothink Video Interview brings us a closeup look at the work and ideas of Teresa Scassa, Canada Research Chair in Information Law at the University of Ottawa. In particular, we talk with her about her views on Canada’s Action Plan for Open Government 2.0, problems with open access under the plan, the idea of making government data open by default and the role of academics (like those in Geothink) in making government more transparent.

Find the interview below. As always, all thoughts and comments are welcome. And, of course, stay tuned for more videos and podcasts soon on Geothink.ca.

If you have thoughts or questions about the video, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca.

An Emergent Field: Making Better Maps By Applying Geographical Science to the Human Brain

Monitoring the human brain. (Photo courtesy of Flickr user Tabsinthe)

By Drew Bush

City planners often use socioeconomic data collected in surveys to determine neighbourhoods that might benefit from improved services. Yet such types of data can have a significant margin of error, especially when they’re collected from a relatively small group.

“Especially if you want to look at a small subset of the population—say, kids under 5 who are living in poverty—the uncertainty level is just huge,” Amy Griffin, a senior lecturer in the School of Physical, Environmental, and Mathematical Sciences at Australia’s University of New South Wales told CityLab’s Laura Bliss last November. “A lot of times, people are making decisions based on highly uncertain census data.”

Her research looks at how different kinds of visualizations can affect decision-making, with an emphasis on the cognitive processes behind the use of maps. Still others in her field work to understand how the human brain engages with maps in order to improve map-making and the role it plays in municipal governance.

Some in the field study neuroscience from the uniquely geographical perspective of how the human brain reacts to maps with differing standards, visual cues and rules. Sara Irina Fabrikant, head of the Geography Department at the University of Zurich, dedicates her time to understanding how users make inferences from the design elements on a map, and how mapmakers might then design maps that convey data more clearly.

For example, in experiments she conducted to compare a NOAA (National Oceanic and Atmospheric Administration) mass-media weather map versus one of her own design, she found design elements could be used to simplify confusing variables and help users better understand characteristics like wind pressure. Her map included well-defined contour lines for wind pressure and less emphasis on temperature colors.

“Are our rules really good ones?” Fabrikant told CityLab. “If so, how? If they’re not, how can we improve them?”

Amy Lobben, head of the Department of Geography at the University of Oregon, asks a different question. She wants to know what neurological processes are at play as individual brains perform map-related tasks. Her possible goal: to create a map that plays off a given person’s cognitive strengths and weaknesses.

“You could potentially design a map that works with individuals’ innate abilities,” she told CityLab.

If you have thoughts or questions about the article, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca.

Open Data Gets Boost from Obama’s 2016 Budget Proposal

Obama’s 2016 federal budget proposal (Photo courtesy of www.philstar.com).

By Drew Bush

Lost in the details of a $4 trillion budget plan proposed by U.S. President Barack Obama are several provisions that seek to increase public access to government data, strengthen government analysis and collection of data and improve data-driven government decision-making, according to a story first reported in the Federal Times.

Released on February 2, Obama’s proposed 2016 budget for the federal government will likely meet strong resistance from a Republican-controlled Congress. However, of more interest to the Geothink audience is the administration’s continuing support for programs that help collect, analyze and share the petabytes of data the U.S. Government collects each day.

“The administration is committed to continuing cost-effective investment in federal statistical programs in order to build and support agencies’ capacity to incorporate evidence and evaluation analyses into budget, management and policy decisions,” the budget reads. “The 2016 budget includes a package of proposals that would make additional administrative data from federal agencies and programs legally and practically available for policy development, program evaluation, performance measurement and accountability and transparency efforts.”

In terms of numbers, the proposed budget would increase funding for statistical programs by 2.5 percent, from $4.2 billion in 2015 to $5.2 billion in 2016. One of the largest shares would go to the U.S. Census Bureau, which would receive $10 million to continue building out its collection of datasets and the infrastructure that allows users to collate, analyze and share data.

The funding would also help the federal government acquire state and municipal datasets that could then become accessible to the public. Furthermore, an additional $2 million would raise the General Service Administration’s E-Government initiative to $16 million. This program seeks to “develop and provides direction in the use of Internet-based technologies to make it easier for citizens and businesses to interact with the Federal Government, save taxpayer dollars, and streamline citizen participation.”

The administration has also supported the creation of a legislative commission proposed by Rep. Paul Ryan, R-Wisc., and Sen. Patty Murray, D-Wash., to examine ways to use government data to improve federal operations and support private research and industry.

If you have thoughts or questions about the article, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca.

Tracey P. Lauriault on Citizen Engagement (or lack thereof) with Canada’s Action Plan on Open Government 2.0

tlauriault312

Tracey is a Postdoctoral Researcher in the new field of Critical Data Studies.

By Drew Bush

More than 1,450 individuals collectively generated 2,010 ideas, comments and questions for the Canadian Government on its Action Plan for Open Government 2.0. But one researcher with The Programmable City project who studies open data and open government in Canada feels these numbers miss the real story.

The process leading up to the “What We Heard” report, issued after the completion of consultations from April 24–October 20, 2014, only reflected the enthusiasm of the open data programming community, she says. A broader engagement with civil society organizations that most need help from the government to accomplish their work was severely lacking.

“They might be really good at making an app and taking near real time transit data and coming up with a beautiful app with a fantastic algorithm that will tell you within the millisecond how fast the bus is coming,” Tracey Lauriault, a postdoctoral researcher at the National Institute for Regional and Spatial Analysis (NIRSA), said. “But those aren’t the same people who will sit at a transit committee meeting.”

She believes the government has failed to continue to include important civil society groups in discussions of the plan. Those left out have included community-based organizations, cities having urban planning debates, anti-poverty groups, transportation planning boards and environmental groups. She’s personally tried to include organizations such as the Canadian Council on Social Development or the Federation of Canadian Municipalities only to have their opinions become lost in the process.

“There is I think a sincere wish to collect information from the people who attend but then that’s it,” she said.  “There is no follow up with some people or the comments that are made—or even an assessment, a careful assessment, of who’s in the room and what they’re saying.”

“I’m generally disappointed in what I see in most of these documents,” she added. “When they were delivering or working towards open data back in 2004, 2005 it was really about democratic deliberation and evidence-informed decision-making—making sure citizens and civil society groups could debate on par with the same types of resources government officials had.”

For its part, the government notes that 18 percent of the participants came from civil society groups. But such groups were really just ad hoc groups that advocate for data or are otherwise involved in aspects of new technology, according to Lauriault. Such input, while useful, is usually limited to requests for datasets, ranking what kind of dataset you’d like to see or choosing what platforms to use to view it, she added.

The report itself notes comments came from the Advisory Panel on Open Government, online forums, in-person sessions, email submissions, Twitter (hashtag #OGAP2), and LinkedIn. In general, participants requested quicker, easier and more meaningful access to their government, and expressed a desire to be involved in government decision-making beyond consultations.

Some suggested that the Government of Canada could go even further toward improving transparency in the extractives sector. For example, proposed legislation to establish mandatory reporting standards could stipulate that extractives data be disclosed in open, machine-readable formats based on a standard template with uniform definitions.

Major themes to emerge from citizen comments on the “What We Heard Report” (Image courtesy of the Government of Canada).

Find out more about this figure or the “What We Heard” report here.

If you have thoughts or questions about the article, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca.

CODE Hackathon Set to Kick-Off as New Report finds the World’s Governments Slow to Open Governmental Data

A new year for open data? (Photo Credit: Tactical Technology Collective)

By Drew Bush

In the first weeks of the New Year, two important news items for the Geothink audience made headlines. In Toronto, the Canadian federal government got ready to kick-off its second annual multi-city Canadian Open Data Experience (CODE) while the World Wide Web Foundation ranked the United States 2nd and Canada 7th for openness of governmental data in its second annual Open Data Barometer.

Canada Ranked 7th

Canada tied with Norway out of 86 countries surveyed based on whether government data was “open by default” as stipulated in the 2013 G8 Open Data Charter. Of more importance, however, was the country’s positive movement in both rankings and scores from last year: Canada moved up one spot in the index.

The survey examines availability of core government data such as company registers, public sector contracts, land titles, how governments spend money and how well public services perform. The U.K. is considered the global leader for open government data, publishing nearly all of these types of data.

Globally, the authors of the report state “there is still a long way to go to put the power of data in the hands of citizens. Core data on how governments are spending our money and how public services are performing remains inaccessible or pay-walled in most countries.”

That’s because fewer than 8 percent of surveyed countries publish datasets on subjects like government budgets, spending and contracts, and on the ownership of companies, in bulk machine-readable formats and under open re-use licenses.

A few key highlights of the report:

1. Only the U.K. and Canada publish land ownership data in open formats and under open licenses.

2. Only the U.K. and the U.S. publish detailed open data on government spending.

3. Only the U.S., Canada and France publish open data on national environment statistics.

Finally, open mapping data is only published in the U.K., the U.S. and Germany (an area where Canada lags).

CODE Hackathon Kicks-Off

In Toronto, developers, graphic designers, students, and anyone interested in trying their hand at coding are getting ready to create innovative uses for the Canadian government’s open data and to win up to $15,000 from the Government of Canada. The 48-hour event is set to begin on February 20th.

Innovations developed at hackathons like this could one day fuel improvements in access to government data. The event attracted 927 developers in 2013, and organizers said that number had grown to over 1,000 by the day of this year’s event.

“Open data is a brand new industry,” Ray Sharma, founder of the event and XMG Studios, told CTV News. “We are in an iceberg situation where we’ve only seen the tip of the data that will become available.”

But just what kind of industry is open to debate, as Geothink researchers Peter Johnson and Pamela Robinson examined in a recent paper. Their questions included whether civic hackathons have the potential to replace the traditional ways that government purchases products and services, and whether these events can be considered new vectors for citizen engagement, according to a post Johnson wrote for Geothink.

For more on CODE, you can watch Canada’s President of Treasury Board, Tony Clement here or read more about this year’s event here.

If you have thoughts or questions about the article, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca.