Bridging Differences in Open Data: Coming up with standards at Open North

Open North has quietly released two reports on open data over the past year.

By Drew Bush

In case you missed either report, over the last year Open North has quietly put out an inventory of open data globally and, in a separate report, recommended baseline international standards for open data catalogs. The first report is entitled Gaps and opportunities for standardization in OGP members’ open data catalogs, while the second is entitled Identifying recommended standards and best practices for open data.

Their work was completed as part of the Open Government Partnership (OGP) Working Group, which aims to support governments seeking transparency through open data. Both reports aim to help the partnership’s 69 member countries improve their ability to share open data by standardizing how it’s made available.

The first report, which inventories open data in OGP’s member countries, notes that most members’ open data initiatives consist largely of open data catalogues. To assess each of these different catalogues, the authors wrote automated scripts to collect, normalize, and analyze them. This process allowed them to set a baseline across countries and identify gaps and opportunities for standardization.
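The report’s own harvesting scripts are not reproduced here, but the general approach it describes (programmatically fetching each catalog’s machine-readable listing and normalizing a handful of metadata fields so catalogs can be compared) can be sketched in a few lines. The snippet below is a minimal illustration that assumes a CKAN-style catalog exposing the standard package_search API; the catalog URL and the fields chosen are placeholders for illustration, not the authors’ actual code.

```python
import requests

def harvest_catalog(base_url, rows=100):
    """Fetch dataset metadata from a CKAN-style open data catalog and
    normalize a few fields for cross-catalog comparison.
    Illustrative sketch only; not the report's actual scripts."""
    resp = requests.get(
        f"{base_url}/api/3/action/package_search",
        params={"rows": rows},
        timeout=30,
    )
    resp.raise_for_status()
    datasets = resp.json()["result"]["results"]

    normalized = []
    for ds in datasets:
        normalized.append({
            "title": ds.get("title", ""),
            # Lower-case the license identifier so catalogs can be compared.
            "license": (ds.get("license_id") or "unspecified").lower(),
            # Collect the distinct file formats offered for this dataset.
            "formats": sorted({
                (r.get("format") or "unknown").lower()
                for r in ds.get("resources", [])
            }),
        })
    return normalized

# Example usage with a hypothetical catalog URL:
# records = harvest_catalog("https://open.example.gov")
# print(len(records), "datasets;", records[0])
```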

“The analysis simply states the choices that OGP members have made with respect to each area for standardization; it makes no judgment as to whether these choices are best practices,” they write in laying out the objectives for the report.

In the second report, the authors address a specific research question: “What baseline standards and best practices for open data should OGP members adopt?” But first they diagnose the problem open data faces globally without any standards.

“The lack of standardization across jurisdictions is one major barrier; it makes discovering, accessing, using, and integrating data cumbersome and expensive, above the expected return,” they write. “A lack of knowledge about existing standards and a lack of guidance for their adoption and implementation contribute to this situation.”

The majority of the report then seeks to address these problems by outlining baseline standards and best practices for open data catalogs, while taking into account the differences between jurisdictions that make the global adoption and implementation of standards challenging. In particular, the report concludes with 33 recommendations for member countries, including that governments provide their agencies with a list of acceptable data formats and that they avoid file compression formats without good software support.

For more of our coverage of Open North’s work on open data, check out our previous Geothink.ca story here.

If you have thoughts or questions about this article, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca.

Using Data to Revolutionize How We Make Decisions

Robert Goodspeed, assistant professor of Urban Planning at the University of Michigan’s Taubman College of Architecture and Urban Planning, examined how decision support systems could be applied to urban planning processes during his doctoral work. This photo is of one such process in Dripping Springs, Texas.

By Drew Bush

The decision support market, a segment of the healthcare industry, made financial headlines when its global value was estimated to reach USD 239 billion by 2019, a jump of almost USD 38 billion since 2012. According to a new report, major players in the industry have poured money into new technologies that can take advantage of big data.

Digital health initiatives like those led by Canada Health Infoway have created a network of systems that securely connect and share health information. Decision support systems like these use computer-based data to aid individual decision-making, supplying a massive bank of previous cases that helps in choosing the most likely answer or predicting trends. Most consist of interactive computer-based systems that use data and models to solve problems requiring geographically or temporally dispersed information.

In healthcare, IBM’s Watson system has been leading the trend to improve decisions made by doctors. “Watson knows what tests are relevant to further characterize a particular patient condition and what tests are not,” the report states. “It is a great help to physicians to have an assistant that is able to have read the latest journal articles and is loaded with medical information to recommend what tests may be relevant in a particular situation.”

An estimated 30 percent of all costs incurred for healthcare delivery come from tests that are either of little value in a patient’s case or sometimes outright wrong, according to some reports. Like platforms offered in other industries, the decision support system engineered by IBM offers the promise of more nuanced testing to enable better decisions on which medical tests can be best applied to specific patient conditions.

Robert Goodspeed, assistant professor of Urban Planning at the University of Michigan’s Taubman College of Architecture and Urban Planning, studies decision support systems.

Using decision support systems to analyze data and make better decisions has helped to improve processes in many industries. Geothink 2015 Summer Institute Instructor Robert Goodspeed, assistant professor of Urban Planning at the University of Michigan’s Taubman College of Architecture and Urban Planning, has studied this trend.

Although Goodspeed doesn’t work in healthcare, his research examines what he refers to as “planning support systems.” His work has looked at how we can use information technology to improve processes that engage community members in urban planning decisions. During his doctoral work, he created a process that allowed individuals to access information about their neighborhood and city to improve discussions.

This research involved community members placing stickers on maps to categorize specific areas for different land uses. The data was then transferred to digital form, with one person entering it as it was called out. Interactions such as this ensured the entered data could be reviewed by the group as a whole and reflected the ideas they had discussed.

“The participants reported learning quite a bit and I could observe their plans evolving,” Goodspeed said. “So that’s just one example of the sorts of tools and practices that I think or feel we need. Especially as we’re facing issues like climate change where we want to quantify things and create indicators, and know how the plans we are creating are going to do or how they’ll perform against these different indicators.”

The Varied Uses of Decision/Planning Support Systems

In more recent research, Goodspeed has applied his work with planning support systems to improving environmental decision-making processes surrounding North America’s Great Lakes ecosystems. Work he’s done as part of the Great Lakes Aquatic Habitat Framework project has used GIS datasets to examine aquatic habitats such as streams, rivers, and lakes in the region. The process also supplies a “big pile of data” for decision-makers in the fisheries and environmental management departments in Canada and the United States.

Unlike in planning, where professional tasks follow a somewhat structured process, ecosystem-based management systems must consider a whole variety of information and tasks, Goodspeed said. Work in the project has included leading participatory design workshops for professionals north and south of the border to aid in the development of a tool that will one day allow easy digital examination of all the information on the Great Lakes collected for the project.

Community participation in planning processes that help envision possible futures often results in a final product that’s inherently more understandable, Goodspeed added.

“And really it requires that kind of combination of creativity but being specific about what you think will happen and what you think will work,” he said of his work with decision-support systems. As big data is increasingly used to inform decision-making, this trend will only continue to grow beyond the industries of healthcare and environmental planning.

Tweet him @rgoodspeed.

If you have thoughts or questions about this article, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca.

The Perils, Pitfalls, and Promise of Open Government – a Geothink Interview with Daniel Paré

Geothink researcher Daniel Paré examines design-reality gaps in Canadian municipal open government platforms.

By Drew Bush

Earlier this month, Public Sector Digest’s first Open Cities Index ranked Canada’s municipalities according to their openness in supplying municipal data online. The index examined the number of data sets available for 34 Canadian cities in three areas: accountability (e.g. elections or budget data), innovation (e.g. traffic volume or service requests), and social policy (e.g. crime rates or health performance). Find more details on this index in a previous Geothink.ca story.

But this type of examination represents only one aspect of a city’s openness. Geothink researchers have cautioned that one must consider each city’s goals in making datasets available (as well as tracking how they are used) when assessing its openness. City platforms that utilize open data, sometimes referred to as e-government, are often hailed as a panacea for making government transparent and the political process more open and inclusive. Such pronouncements have accompanied the digitization of government records and data since the 1990s.

Geothink.ca recently sat down with one Geothink researcher to assess the validity of this claim, the downsides of e-government, and to discuss his research on the topic. Daniel Paré is an associate professor in the Department of Communication and School of Information Studies at the University of Ottawa where he also serves as an associate director at the Institute for Science, Society and Policy. His research focuses on the social, economic, political, and technical issues arising from innovations in information and communication technologies in developing and industrialized countries.

Geothink.ca: So tell us a bit about your current research interests, and what you are most excited about in your work.

Paré: What I’m interested in looking at is the points of convergence and divergence between the rhetoric surrounding e-government in the late 1990s early 2000s with much of what we’re hearing about open government data and open data and the promises and perils and pitfalls and such, and sort of contrasting those two. In large part, it’s motivated by the fact that one of the things that I’ve been struck by just sort of informally is it just seems to me that there’s just tremendous parallels almost to the point of sort of repeating the same sort of mantras that we were repeating a little more than 10 years ago what with regards to e-government.

So I want to see the way that that holds. It plays into this whole idea in terms of the myths that are associated with technological change in terms of the liberating potential, the progressive potential, these sort of technological developments. So certainly in the area of open government data, the question becomes, or the issue is sort of, we hear lots of rhetoric about political progress and economic progress and such, and I basically want to suss those things out.

Geothink.ca: What are the differences between citizen-government and client-government interactions and what do you think the transformation toward open government is doing for both of those audiences?

Paré: Well, if we go back to e-government, at the time that e-government came on to the scene, part of the debate was between e-governance and e-government. And a lot of the early discourse and rhetoric around there was focused on the democratic potential. So citizens would be able to now access information much more easily— government information—become more engaged with their government on multiple levels, and, in order to simplify here, everything would become rosy. The underlying assumption being basically that with the ability to have access to information, citizens would seek out that information and would become more engaged in the political domain as a result of that. And almost sort of, in its more extreme cases, [it was] presented almost as sort of linear, ipso facto, done deal.

It was quickly identified, that in many ways e-government wasn’t about e-governance per se, certainly not in the political sense. It was about delivering services more effectively to citizens but in the role of clients essentially. Nothing wrong with that but that’s fundamentally different from political engagement as it’s normally understood. So, yes, it’s fantastic that, yes, I can file all my taxes online, or that we can get information, or that we can renew our licenses, or that we can have access to that information, but that’s more of a client service based implementation and usage than a sort of political domain.

If we jump forward now to the recent years in terms of open government and open data, we have a number of sorts of different discourses that are playing around. Part of it is to say that yes there’s open data and open government—bearing in mind that they’re separate things—that, you know, with access to this information, that fosters greater transparency and hopefully greater transparency [fosters] less corruption, more effective government, etc. The other aspect of that—complementary aspect—is sort of the economic angle saying well if people have access to government information they can harvest this information, they can come up with new sorts of innovations whether that be an app or some sort of other product that gets developed as a result of an analysis of the information that’s now available to them. And this then becomes a means or mechanism for fostering economic growth.

So you have those discourses or those narratives playing out. Now the issue, or one of the many issues, is the fact that realizing those benefits depends on a whole host of factors. And [governments] are dealing here with issues in terms of how do [they] organize and respond to demand, how do [they] organize and respond to supply, and how do [they] organize and respond and try to promote innovation. So you have those sorts of three things playing out. And so to come back to what I mentioned earlier about notions of myths around technologies…we tend to do away with, narratively, with the complexities and ambiguities that are associated with these processes. And so if we say, yes, you know, open data and open governance is a fantastic tool for promoting transparency and enhanced democracy, well possibly, yes, and possibly, no. We need to unravel that. It’s not a done deal. But the myth of that rhetoric is a punchy message. Likewise if we say, yes, well open data and open government is fantastic because it can spur economic growth and all sort of innovations. Fair enough. But again that covers up the challenges and complexities that are associated with that.

Geothink.ca: How does this relate to gaps you are seeing in how platforms are designed for e-government and their actual implementation in terms of how they are used?

Paré: In other work that I’ve done, we do a lot of stuff around the ideas of design-reality gaps. And so the notion there being that, you know, we may design a particular platform with a particular purpose in mind. And it has particular potential but then when we look at the implementation of a particular platform often times it has a host of unintended consequences. There is no guarantee that it will be used in a particular way. And so the opportunities and potentials that were meant to be reaped don’t materialize, right?

In some of those cases that might be linked to the platform itself and in others cases it might be linked to organizational factors. So we can think in terms of a government information system. If we are going to put in a new information system in the government bureaucracy, for example, the assumption is that it will enhance interdepartmental exchanges of communication and information. What that view overlooks is the turf battles between departments and agencies within government. The idea there, in this example, being that it’s not because we have the effective communication system in place that it will actually be used in an effective manner because there are other sociopolitical and cultural factors in that regard.

In the case of open data and open government, we tend to see for example a lot of claims about, sort of, hey, it’s great this information is online people are going to use it. But one of the early challenges that we encountered was, say, well those who can actually use it and do something with it are a very limited and niche segment of the population…The raw data—the raw information that’s there—is in such a form that people don’t know what to do with it or how to manipulate it. So on the one hand, yes, it’s open data the information is there, on the other hand, great, it’s there but what do I do with it if I don’t have the computer savvy or the statistical skills to deal with the information that’s there? So those are those sorts of gaps and complexities that I’m interested in.

Geothink.ca: How does your work relate to Geothink’s research goals and what do you think of the partnership?

Paré: Great question. Geothink relates to this for me in the sense of the open data, open government aspect of it. I had come to this project, Geothink, sort of as an outsider. For me, Geothink, very early on before I knew very much about it was oh, you know, you’re talking about geographical information systems. Which I’ve since learned we’re moving well beyond that. So for me, the issue in terms of Geothink and Geoweb, it fits into issues of open data, open government and clearly the geo part sort of entails a locational element in terms of locational types of data.

Tweet your ideas on this interview to Daniel Paré @DJ_Pare

If you have thoughts or questions about this interview, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca.

Crowdsourcing for better science and governance?

Cornell University Lab of Ornithology’s eBird Web site allows citizen scientists to contribute data on birds for real scientific research, one novel application of crowdsourcing.

By Drew Bush

At Cornell University’s Lab of Ornithology, scientists have long benefited from the legions of enthusiasts who find joy in observing and reporting the birds they see during their daily routines. In 2002, the lab worked with the United States National Audubon Society to launch eBird, an online database where scientists and amateur naturalists can submit real-time observations of the birds they see and their behavior. Since 2013, scientists have benefited from more than 100,000,000 observations covering over 10,240 species, generated by the program’s more than 100,000 users.

Often hailed as an application of crowdsourcing that democratizes science by giving citizens the power to contribute, eBird is emblematic of a recent trend in applying crowdsourcing to problems outside the for-profit business sector where it began. In Canada, a new Community Fishers application allows citizen scientists to collect oceanographic data for Ocean Networks Canada, and a number of Canadian cities use PlaceSpeak to collect public opinions on topics related to given locations. In the United States, this trend has led to the introduction of the Crowdsourcing and Citizen Science Act. The bill’s author, Senator Chris Coons of Delaware, wrote in a Wired article this past September that his bill makes explicit “that executive branch agencies, commissions, and all military branches have the explicit authority to make use of crowdsourcing and citizen science projects, utilizing the resourcefulness and innovation of the public to solve problems.”

Geothink researcher Daren Brabham is an assistant professor at the University of Southern California School for Communication and Journalism.

Geothink researcher Daren Brabham, an assistant professor at the University of Southern California School for Communication and Journalism, has long worked on research that supports this development. He is also widely credited as being the first to publish scholarly research that utilizes the word crowdsourcing. (Although he himself notes that one-time Wired editor Jeff Howe actually coined the term in a June 2006 issue where he wrote about Threadless.)

“I’ve got this kind of crazy idea for a citizen science kind of hub, you know call it for lack of a better term a [United States] citizen science corps for instance, or a North American citizen science corps,” Brabham said. “It would be a big program where all these scientific organizations could host—and also museums and researchers at state universities—could host their challenges that they want online communities to go do and solve these problems and gather scientific data in the community or whatever it might be. To post those in one single hub and find a way to gamify that where people can earn badges or level ups or even prizes for donors, or whatever it might be, to get people engaged in helping these organizations.”

Brabham believes crowdsourcing represents not only a tool to help scientists or the government do their work, but an opportunity to redefine what it means to provide service to one’s country—which in the past has been synonymous with military service. He envisions a future where, if the United States National Aeronautics and Space Administration (NASA) needs help analyzing water levels or categorizing stars by galaxy, it could involve thousands of citizen scientists in the project much like eBird does today.

Today’s trends in crowdsourcing mirror the evolution of his work. In particular, Brabham’s early work focused on applying crowdsourcing to uses in government, non-profits, and in public health. Many of these uses have since become common with, for example, the United States Office of Citizen Services and Innovative Technologies using DigitalGov to provide “information and services to the public anywhere and anytime.” In particular, one of its recent products, The Federal Community of Practice on Crowdsourcing and Citizen Science (CCS), works across the government to share lessons learned and develop best practices for designing, implementing, and evaluating crowdsourcing and citizen science initiatives.

Design Matters

Some of the key concerns for any crowdsourcing initiative, whether it be for urban planning, policy-making, or a for-profit venture, are how to build a committed online community, sustain interest in it, and handle dissent amongst its users. Researchers on this subject seek to answer the question of what drives these communities to form and whether design issues inhibit or accelerate this process.

For example, it’s difficult to know whether crowdsourced citizen science succeeds best when it involves primary school students in projects that count butterflies or when it instead uses iPhones to measure soil samples. It also helps to understand why certain types of web-based platforms succeed at engaging communities while others struggle. Brabham calls this type of assessment work “user-experience design,” which was a particular focus during Geothink’s 2015 Summer Institute.

In work he’s completed with other researchers, Brabham has found that platforms that are easy to use, enjoyable, and intuitive have higher success rates. This may sound obvious, but it’s more than just establishing a set of best practices for how all platforms should be designed. Instead, online sites must be organized according to the different tasks users are asked to complete or the different roles they might play.

For example, Brabham often talks about Threadless, a crowdsourced clothing and apparel site, and not just because his early involvement with it set him on his current research path (his now-wife suggested during his doctoral work that he write a paper applying the approach to sociocultural issues). In particular, he cites how Web sites like it give users a very clear idea of the audience they’re intended for and the activities they allow users to undertake (shop, submit designs, or rate designs), and also include a clear, user-friendly interface.

“I think when people are asked to contribute content or ideas or whatever it might be to a web site or organization in a crowdsourced arrangement, they really do care about how easy it is to convey their idea to you and figure out how it’s going to be used,” he said.

He points out that all too often researchers and critics focus on examining bad crowdsourcing initiatives rather than on what makes a given effort work. As crowdsourcing continues to be used in the public realm to help with citizen science efforts, paying attention to the details will become increasingly important. In particular, designers must provide users with multiple entry points, web sites with component parts organized based on tasks, and clear front pages that don’t overwhelm the average person.

A plethora of other issues surround both the implementation of crowdsourcing in public policy and citizen science, and its possible future uses. Brabham writes more about recent trends in the use of crowdsourcing in his recently published book, Crowdsourcing in the Public Sector. His earlier book, Crowdsourcing, is often cited for its importance in tracing crowdsourcing’s origins, future applications, and potential research paths.

If you have thoughts or questions about this article, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca.

Four Geothink Partner Cities Named to Top 10 on First Ever Canada Open Cities Index

Rankings of Canada’s Top 10 cities out of a possible maximum score of 193 in Public Sector Digest’s 2015 Open Cities Index (Image courtesy of Public Sector Digest).

By Drew Bush

Numerous city, state, and provincial governments across North America are finding new ways to share government data online. With more than 60 nations now part of the Open Government Partnership, it’s often difficult to determine which initiatives are simply part of a growing fad instead of being true attempts at more responsive and accountable government.

In the United States, President Barack Obama announced plans in 2009 to make government information at many federal agencies open by default, yet just last month the office charged with carrying out this directive failed to openly publish a schedule for its guidelines on this work. In Canada, a variety of city initiatives aim to allow citizens to more easily view crime statistics, find out information about neighborhood quality of life, or time the arrival of the next bus. With so many initiatives, it can be difficult to determine which best improves municipal responsiveness or offers new services to citizens, particularly amidst promises by the newly elected Liberal government on open data.

The authors of Public Sector Digest’s first ever 2015 Open Cities Index aim to solve this problem by providing “a reference point for the performance” of open data programs in 34 Canadian cities. The authors of the index undertook a survey to measure 107 variables related to open data programs. In particular, the index measures three types of data sets cities may have made available: those related to accountability (e.g. elections or budget data), innovation (e.g. traffic volume or service requests), and social policy (e.g. crime rates or health performance).

For each data set in these three categories, municipalities were scored on five variables, according to questions such as whether their data sets are available online, machine readable, free, and up-to-date. The aim was to help these municipalities, which often have limited resources to spend on open data programs, assess their strengths and weaknesses and improve their open data programs.
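Public Sector Digest has not published its full rubric, but the arithmetic the survey implies is a simple tally: each data set earns a point for every criterion it satisfies, and a city’s total is the sum across its data sets. The sketch below illustrates that kind of tally; the criterion names (including a fifth, “bulk_download”, which the article does not specify) and the equal weighting are assumptions for illustration, not the index’s actual methodology.

```python
# Minimal sketch of an Open Cities Index-style tally.
# The criteria and equal weighting are illustrative assumptions;
# Public Sector Digest's actual rubric is not reproduced here.
CRITERIA = ["online", "machine_readable", "free", "up_to_date", "bulk_download"]

def score_city(datasets):
    """Sum one point per satisfied criterion across all of a city's data sets."""
    return sum(
        sum(1 for c in CRITERIA if ds.get(c, False))
        for ds in datasets
    )

example_city = [
    {"name": "budget", "online": True, "machine_readable": True,
     "free": True, "up_to_date": False, "bulk_download": True},
    {"name": "crime", "online": True, "machine_readable": False,
     "free": True, "up_to_date": True, "bulk_download": False},
]
print(score_city(example_city))  # -> 7
```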

Four Geothink partner cities made the top 10 of the index, with Edmonton in first place, Toronto second, Ottawa fourth, and Vancouver sixth. At last year’s Canadian Open Data Summit, Edmonton also won the Canadian Open Data Award. You can find the full list of city rankings on the report’s home page. Yet the value of these types of ratings and awards will only be shown over time, according to many practitioners in the field.

“It’s hard to tell what it means to be ranked fourth because it’s a brand new thing,” said Robert Giggey, the coordinator and lead for the City of Ottawa’s Open Data program. “It’s not something that’s done every year, every month, that everybody knows about and is waiting for. So it’s kind of yet to be determined.”

The Value of the Index

Other indexes have measured open data at the national level, such as the Open Data Barometer. And measurements of municipal open data undertaken by two university students focused only on what types of data sets were available. The Open Cities Index works to take this a step further by engaging with key areas of interest. In particular, the index aims to standardize measurements around three themes:

1. Readiness—To what extent is the municipality ready/capable of fostering positive outcomes through its open data initiative?
2. Implementation—To what extent has the city fulfilled its open data goals and ultimately, what data has it posted online?
3. Impact—To what extent has the posted information been used, what benefits has the city accrued as a result of its open data program, and to what extent is the city capable of measuring the impact?

One Geothink researcher cautions, however, that it’s difficult to ascertain the worth of the index until its authors make the full report available along with more information on the 107 variables surveyed. In particular, he said, implementation can be a difficult metric to measure because different cities have different data collection responsibilities and different goals.

“I’m working on some research right now that shows that governments don’t actually have very good tracking metrics for use,” Peter Johnson, assistant professor in the Department of Geography and Environmental Management at the University of Waterloo, wrote in an e-mail to Geothink.ca. “Much of their sense of who uses open data and what it is used for is anecdotal and certainly incomplete. Since open data is provided with few restrictions, it is difficult to track who is using it and what it is being used for in any comprehensive way.”

Beyond the data already online, cities interested in being included in future years of the index and accessing a detailed analysis of municipal open data programs across Canada must contact Public Sector Digest. Some municipalities, like Ottawa, may wait and see how it goes in those places that have already paid for the service, according to Giggey.

“I want to see what the reaction is from the open data community, from other jurisdictions, from other areas—Geothink—about what they think of the index,” he said. “Is this any good? Is it worth anything? Then we’ll look to see if it’s something we want to invest in.”

A screen shot of Toronto’s Open Data portal for city hall.

The Reaction Among Geothink Partner Cities

The value of the index will be determined as more details on its methodology and conclusions are released and, perhaps, as it becomes a regular measure of open data work in Canada’s municipalities. For now, city staff in charge of open data work in the cities interviewed by Geothink.ca agree that the index does achieve the goal of bringing recognition to the work they are doing. In Ottawa, this has included work to make the city accountable by providing datasets on elected officials, budget data, lobbyist and employee information, and 311 calls. Toronto got a relatively early start with city budgets in 2009 and now also has a portal with social data on neighborhoods (including datasets like demographics, public health, and crime rates).

“I am glad the index recognizes the time and effort each city puts in to make its data open and accessible for reuse and repurpose,” Linda Low, open data coordinator for the City of Vancouver, wrote in an e-mail. Datasets in her city include information on crime, business licenses, property tax, orthophoto imagery, and census local area profiles. “This doesn’t happen overnight and it certainly is a team effort to get to where we are today.”

Edmonton’s recognition for its work derives from a 2010 decision by city leaders to launch an open data catalogue and the 2011 award of a $400,000 IBM Smarter Cities Challenge grant. Work in the city has included using advanced analysis of open data streams to enhance crime enforcement and prevention, an “open lab” to provide new products that improve citizen interactions with government, and interactive neighbourhood maps that will help Edmontonians locate and examine waste disposal services, recreational centres, transit information, and capital projects. More can be found on Edmonton’s work in a previous Geothink article.

“We are thrilled and honoured that our innovation and hard work have been recognized,” Yvonne Chen, a strategic planner for the City of Edmonton, wrote in an e-mail. She noted that Edmonton’s success, which results directly from a city council policy on open data, includes having an online budget tool that increases transparency about the allocation of public funds. “Our goal has always been to be a leader in the Canadian open government movement.”

While the recognition helps bring attention to the work being done by cities, much remains to be seen about how well the index actually compares cities against each other when objectives and the types of data recorded can vary greatly.

“It’s great to be in the top 10 any time, but we know from when we got the survey sent to us, we weren’t sure of all their measures that they were taking,” Keith McDonald, open data lead for the City of Toronto, said.

“We’d like to see other studies and maybe a little more apples to apples comparison for sure,” he added. “I think actually that was the intent—I can’t speak for the Public Sector Digest—but I think that was the intent of having an ongoing group that would buy into their measuring, so that people could continue to tweak and make it a stronger real apples to apples comparison. And we would support that.”

In fact, the value of an index like this one may lie in allowing cities to track their own progress over time.

“For all those cities included (and even those that aren’t) it can help to narrow the field as to where effort may be best placed to improve open data provision,” Johnson wrote of what he called a “high-profile external evaluation” of each city’s work.

If you have thoughts or questions about this article, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca.

Can Citizen Science Help Cities Address Climate Change?

Mapping for Change supports citizen science inquiries into environmental and social issues. Here, participants take noise level readings in regions around a London airport. Photo courtesy Mapping for Change.

By Naomi Bloch

If you were following the recent climate change talks in Paris, you may have noticed a recurring theme: policymakers acknowledging the leadership of subnational governments in addressing climate change. Canada’s own delegation to the conference included representatives from the Federation of Canadian Municipalities, as well as provincial and indigenous leaders.

While the 2015 United Nations Climate Change Conference focused on political negotiations, critics have been quick to remind legislators that more efforts are needed to involve citizens in decision-making. It’s hardly a new idea, but how can civic participation function at a global scale? Activities at the local level may hold the key. Municipalities often have established mechanisms to involve the public in deliberative activities. Cities and their citizens can also collaborate on the evidence-gathering needed to make informed decisions.

Geothink collaborator Muki Haklay is the director of University College London’s Extreme Citizen Science group and a professor of Geographic Information Science in the Department of Civil, Environmental and Geomatic Engineering. In 2008, he co-founded Mapping for Change, an organization that uses participatory mapping and citizen science to address environmental and social issues in cities.

Muki Haklay, professor of Geographic Information Science in the Department of Civil, Environmental and Geomatic Engineering, University College London.

“I see the value of citizen science as part of wider environmental democracy, going back to the Rio conference in 1992,” Haklay explained in an email interview with Geothink. Principle 10 of the Rio Declaration on Environment and Development states that, at national levels, citizens should have “appropriate access to information concerning the environment that is held by public authorities, including information on hazardous materials and activities.” At the community level, the Declaration calls for active and informed public participation in environmental decision-making processes.

Citizen science invites non-professionals to participate in data gathering and the production of new scientific knowledge. “I see citizen science as a new part of the picture,” said Haklay, “where people also participate in creating environmental information that will influence their lives.”

In Haklay’s view, citizen science has particular benefits that can complement traditional research. “The various changes that have occurred in society and technology mean that we can open environmental decision-making further and make it more inclusive and participatory.” As with all research, appropriate rigor and attention to methodology are required. “Not all data should come from citizen science,” said Haklay. “In terms of data quality, citizen science requires us to use appropriate quality assurance methods.”

Mapping for Change provides some helpful exemplars. One collaboration with local organizations has seen thirty different communities across London measuring and mapping air quality data for their neighbourhoods. “We used a whole range of methods: wipe samples, where we checked for heavy metals in dust on different surfaces; diffusion tubes which measure NO2 levels; and bio-indicators — lichens and leaves,” Haklay said. The project’s findings provide location-specific data that can help alert authorities to potential problem zones. “The local authority responded to the results by promising to do their own monitoring in the area and consider how they can manage the traffic in the area.”

Particularly when expensive equipment or lab analysis is needed, resource limitations can create challenges. However, Haklay points out some unique benefits. “Citizen science provides additional information about the context — local knowledge about the place where the monitoring is taking place,” said Haklay. “Participants can also put equipment in their own homes, which is complex for researchers or government agencies.” The citizen science water study in Flint, Michigan, is a good example of this.

Constraints, of course, are not just funding-related. “Not all people would want to do it, and not everyone will have the skills, though we need to consider how to help people in developing them,” Haklay said. “The limitations are the knowledge that people have, their perception of science and their own capabilities, and the abilities of those who manage citizen science projects to engage at such levels. We shouldn’t expect all scientists to be able to facilitate the whole process on their own.”

Haklay suggests that government agencies looking to incorporate citizen science in their data gathering processes should consult the report, Choosing and Using Citizen Science, produced by the UK’s Centre for Ecology and Hydrology. The report reviews resource and management issues, political issues, as well as scientific issues.

The key to citizen science is that it can involve a range of activities. “Participants can help in setting the research question, create protocols that are suitable to their local culture and needs, analyze the information, participate in the production of reports and papers — in short, in everything,” Haklay said. “The value is in making science more open and more collaborative.”

Interested in learning more about Muki Haklay’s citizen science work? Follow him on Twitter: @mhaklay

If you have thoughts or questions about this article, get in touch with Naomi Bloch, Geothink’s digital journalist, at naomi.bloch2@gmail.com.

Citizen Science and Intellectual Property: A Guide for the Perplexed

Citizen scientists help researchers transcribe historical climate records and photograph natural phenomena.

By Naomi Bloch

The concept of the science hobbyist — the backyard astronomer staring up at the sky or the amateur ornithologist taking part in the annual Christmas bird count — is hardly a new one. What is notable today, however, is the scale and scope of new collaborations between research institutions and volunteer citizen scientists. These kinds of citizen science partnerships have inspired a new study by Geothink co-applicant Teresa Scassa and doctoral candidate Haewon Chung, called “Managing Intellectual Property Rights in Citizen Science: A Guide for Researchers and Citizen Scientists.”

Chung, now a Geothink Ph.D. student researcher at the University of Ottawa’s Faculty of Law, is interested in intellectual property (IP) law with a particular focus on digital ethics. Scassa is a Canada Research Chair in Information Law at the University of Ottawa. The two researchers will participate in a panel discussion and launch of their new work at the Wilson Center Commons Lab, in Washington, D.C., on December 10.

Teresa Scassa, Canada Research Chair in Information Law at the University of Ottawa

“Part of the point of the guide is to encourage people to take a proactive approach and think about what they want to get out of the citizen science project, and what they need to get out of the citizen science project,” Scassa said. “For example, if you need to publish your research results, or if you need to keep your data confidential because you’ve got private sector funding that requires that, how do you structure the IP side of things so that you can do that?”

Citizen science, broadly construed, involves the participation of non-professional scientists in scientific data gathering and the production of new scientific knowledge. In the realm of Geothink, this generally takes place in the context of volunteered geographic information (VGI) and contributions to geographic knowledge. Some projects may involve volunteers helping with laborious tasks such as transcribing historical climate data. In other cases, participants may be sharing geocoded photos or video footage, recording audio, or producing text narratives. In countries like the U.S. and Canada, such original, creative efforts are inherently protected by copyright law — something participants themselves may not even realize.

Do you copy?

The 80-page report is divided into three parts, beginning with a concise review of relevant areas of intellectual property law including copyright, patent, trademark and trade secret law, as well as specific considerations involved in the protection of traditional knowledge. Though copyright law differs around the world, essentially copyright grants certain exclusive rights to the authors of creative works. These rights usually include the right to control how work is distributed, reproduced, and re-used. “It’s also significant because it arises inadvertently,” said Scassa.

Unlike other types of intellectual property such as patents, in many countries the creator is automatically granted copyright protections without taking any specific legal actions. “There are going to be copyright issues with respect to any website that’s created, and with respect to many different types of contributions that users might make, whether they’re text-based or photographs or video clips or whatever they might be,” Scassa said. “There are copyright issues with respect to compilations of data. And then, of course, those copyright issues are relevant if, for example, the researcher decides to publish in a closed access journal and the participants want access to those research results, and all these sorts of things.”

When institutional researchers initiate citizen science projects, there are commonly expectations regarding eventual publication of findings, data sharing, posting information online, as well as educational and civic aims. “Depending on the nature of the project, the users may expect to have total access to the research results — to any publications, but maybe also to all of the data that’s been gathered,” said Scassa. “So we encourage the researchers who are creating citizen science research projects to think about what the user community may be expecting from them in terms of the project design.”

Ethics and law in the balance

In the second section of the study, the authors explore some of the ethical issues that arise in light of IP law. This includes everything from appropriate attribution to uses of participants’ contributions as well as research output. “If you’re going to be collecting stories or traditional knowledge from a community, for example, then that’s going to result in some intellectual property,” Scassa said. “And the ethical requirements may be different from the bare legal requirements. Part of it is being aware of what the legal defaults are and how those might need to be altered in the context of the relationship that you have with your participants.”

Scassa notes that researchers’ relationship with citizen scientists is generally one among many. “Researchers at universities have a complex web of relationships,” Scassa said. “Their universities have IP policies; those IP policies might provide that all IP stemming from this research may belong to the university and not the researcher, so they may not be able to promise certain things in their projects. Their funders may have expectations, and their publishers may have expectations. They may also have expectations in terms of the ability, perhaps, at some point in the future to patent some of their research. So they have this complex web of relationships and their relationship with citizen scientists is one of those relationships. We encourage them to think about this web of relationships and these expectations and try and design accordingly.”

To help with this process, the third section of the study guides readers through the various types of licensing options that can be applied. The authors provide diverse examples from real-world citizen science projects both local and global, and a toolset to help project designers as well as participants understand their options. “We don’t want to create barriers,” Scassa said. “It’s a really complex area. We’re trying to make it as accessible and as useful as possible, just to try to get people thinking about these ideas.”

The Wilson Center panel discussion, “Legal Issues and Intellectual Property Rights in Citizen Science,” takes place at the Wilson Center Commons Lab, Washington, D.C., Wednesday, Dec. 10, 11 a.m.–12:30 p.m. ET. There will be a live webcast of the event.

Interested in learning more about intellectual property law and citizen science? Reach out to Teresa Scassa on Twitter: @TeresaScassa.

If you have thoughts or questions about this article, get in touch with Naomi Bloch, Geothink’s digital journalist, at naomi.bloch2@gmail.com.

Geothink Newsletter Issue 9

For our last update of 2015, we have a Research Profile of Dr. Claus Rinner (Ryerson University) and his recently graduated Ph.D. student Victoria Fast. The Geothink Newsletter will return in Q1 2016 with more updates on Geothink. Until then, we wish you all a good holiday season.

Claus Rinner research spheres

Download Geothink Newsletter Issue 9

If you have feedback or content for the newsletter, please contact the Editor, Peck Sangiambut.