
Spotlight on Recent Publications: Open Data and Official Language Regimes


The bilingual federal Open Government portal

By Naomi Bloch

Teresa Scassa is a Geothink co-applicant researcher and Canada Research Chair in Information Law at the University of Ottawa. In a recently published paper, Scassa and co-author Niki Singh consider some of the challenges that arise for open data initiatives operating in multilingual regions. The authors use Canada’s federal open data initiative as a case study to examine how a government body in an officially bilingual jurisdiction negotiates its language obligations in the execution of its open data plan.

The article points out two areas for potential concern. First, private sector uses of government data could result in the unofficial outsourcing of services that otherwise would be the responsibility of government agencies, “thus directly or indirectly avoiding obligations to provide these services in both official languages.” Second, the authors suggest that the push to rapidly embrace an open data ethos may result in Canada’s minority language communities being left out of open data development and use opportunities.

According to Statistics Canada’s 2011 figures, approximately 7.3 million people — or 22 percent of the population — reported French as their primary language in Canada. This includes over a million residents outside of Quebec, primarily in Ontario and New Brunswick. Canada’s federal agencies are required to serve the public in both English and French. This obligation is formalized within Canada’s Charter of Rights and Freedoms, as well as the Official Languages Act. Government departments are provided with precise guidelines and frameworks to ensure that they comply with these regulatory requirements in all of their public dealings and communications.

Scassa and Singh reviewed the various components of the federal open data initiative since the launch of the program to determine how well it is observing bilingual requirements. The authors note that while the open data infrastructure as a whole largely adheres to bilingual standards, one departure is the initiative’s Application Programming Interface (API). An API provides a set of protocols and tools for software developers. In this case, the API supports automated calls for open data housed in government databases. According to the authors, “As this open source software is not developed by the federal government, no bilingualism requirements apply to it.” While professional developers may be accustomed to English software environments even if they are francophones, the authors point out that this factor presents an additional barrier for French-language communities who might wish to use open data as a civic tool.
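To give a sense of what such automated calls look like, here is a minimal Python sketch, assuming a CKAN-style dataset-search endpoint of the kind many government open data portals expose. The URL, parameter names, and response shape here are illustrative assumptions, not details taken from the article:

```python
import json
from urllib.parse import urlencode

# Hypothetical CKAN-style search endpoint; the portal's real URL may differ.
BASE_URL = "https://open.canada.ca/data/api/action/package_search"

def build_search_url(query, rows=10):
    """Construct the kind of dataset-search request the portal's API accepts."""
    return BASE_URL + "?" + urlencode({"q": query, "rows": rows})

def extract_titles(response_text):
    """Pull dataset titles out of a CKAN-style JSON response."""
    payload = json.loads(response_text)
    return [result["title"] for result in payload["result"]["results"]]

# A canned response standing in for what the API would return over the network.
sample = json.dumps({
    "success": True,
    "result": {"count": 2, "results": [
        {"title": "Budget des dépenses"},
        {"title": "Government spending"},
    ]},
})

url = build_search_url("spending", rows=2)
titles = extract_titles(sample)
```

Note that the request and response here are English-oriented by default — a small illustration of the authors' point that the tooling surrounding open data carries its own language assumptions.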

In their analysis of the data portal’s “apps gallery,” Scassa and Singh observed that the majority of apps or data tools posted thus far are provided by government agencies themselves. These offerings are largely bilingual. However, at the time of the authors’ review, only four of the citizen-contributed apps supported French. In general, public contributions to the federal apps gallery are minimal compared to government-produced tools.

As part of their analysis, the authors also looked at the two Canadian Open Data Experience (CODE) hackathon events sponsored by the government in order to promote civic engagement with open data. Communications leading up to the events were provided in English and French. Government documentation also indicated strong participation from Quebec coders at the CODE hackathons, though native language of the coders is not indicated. Interestingly, the authors note, “In spite of the bilingual dimensions of CODE it has produced apps that are for the most part, English only.”

The 2015 event, which was sponsored by government but organized by a private company, had a bilingual website and application process. However, Scassa and Singh found that social media communications surrounding the event itself were primarily in English, including government tweets from the Treasury Board Secretariat. Given this, the authors question whether sufficient effort was made to attract French-Canadian minorities outside of Quebec, and if specific efforts may be needed to gauge and support digital literacy in these minority communities.

While it is still early days for Canada’s open data initiative, this case study serves to highlight the challenges of supporting an open data platform that can meet both legal obligations and broader ethical objectives. The authors conclude that, “In a context where the government is expending resources to encourage the uptake and use of open data in these ways, the allocation of these resources should explicitly identify and address the needs of both official language communities in Canada.”


From the article’s abstract: The open data movement is gathering steam globally, and it has the potential to transform relationships between citizens, the private sector and government. To date, little or no attention has been given to the particular challenge of realizing the benefits of open data within an officially bi- or multi-lingual jurisdiction. Using the efforts and obligations of the Canadian federal government as a case study, the authors identify the challenges posed by developing and implementing an open data agenda within an officially bilingual state. Key concerns include (1) whether open data initiatives might be used as a means to outsource some information analysis and information services to an unregulated private sector, thus directly or indirectly avoiding obligations to provide these services in both official languages; and (2) whether the Canadian government’s embrace of the innovation agenda of open data leaves minority language communities underserved and under-included in the development and use of open data.

Reference: Scassa, T., & Singh, N. (2015). Open Data and Official Language Regimes: An Examination of the Canadian Experience. JeDEM – eJournal of eDemocracy and Open Government, 7(1), 117–133.

If you have thoughts or questions about this article, get in touch with Naomi Bloch, Geothink’s digital journalist, at naomi.bloch2@gmail.com.

A First-Hand Account of McGill University Team CODE’s Experience in the 1st Annual ECCE App Challenge Hosted by Environmental Systems Research Institute (ESRI)

By Jin Xing

I was one of three Geothink students who competed in the 1st Annual ECCE App Challenge hosted by the Environmental Systems Research Institute’s (ESRI) Canada Centre of Excellence. Team CODE-McGill, which consisted of McGill University students Matthew Tenney, Carl Hughes, and myself, placed second in the competition, which concluded on March 20 with the announcement of a winning group from the University of Waterloo.

Although our three team members each have different research interests, all of us study topics related to open data. Our Community Open Data Engage (CODE) application was sparked by an exchange I had with Hughes after the competition had already begun, when we discovered we both call Toronto, Ontario home. In fact, it was only after Hughes told me that my neighbourhood was “a better” place to live that we began to interrogate the question of how to evaluate a community using open data.

As we worked on our submission, we noticed that community-level open data attracts more attention than city-wide data. In particular, we found citizens were more concerned with data on traffic, education, and recreation resources in their own neighbourhoods than with other types of data. Our creation: a new approach for exploring a community using an open data platform that connects people and communities.

However, the application that we designed required a number of trade-offs to be decided in the span of only one week. First, we struggled to choose whether to include more data or to favour an easy-to-use interface. In particular, we wanted to develop functionality to integrate a greater variety of community data but didn’t want the application to become too hard to use. After several hours of discussion, we decided to favour an approach that centered on making open data “easy and ready to use.”

The second trade-off involved the selection of ESRI JavaScript APIs. In particular, we had to choose between the ESRI ArcGIS API and ESRI Leaflet for open data integration and visualization. At the beginning, I preferred the ArcGIS API for its rich functionality. But Tenney pointed out that it was more than we needed and might slow page loading, which led the team to settle on Leaflet.

Finally, we had to decide how to integrate social media. In particular, we needed to decide whether Twitter content should be loaded as a live stream or simply retrieved from the back-end. All of us felt it would be cool to have a real-time Twitter widget on our application’s page, but we didn’t know how to get it to choose the right tweets. For example, a user named Edmonton might say nothing about the City of Edmonton, and our code would have needed to filter such tweets out in real time. Considering the difficulty of developing such a data-filtering AI in one week, we decided to process tweets only on the back-end. To accomplish this, we used Python to harvest and process the data, while ESRI Leaflet handled all the front-end data integration and visualization.
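To illustrate the kind of back-end filtering involved, here is a small Python sketch. The keyword heuristic and function names are hypothetical illustrations, not the team’s actual code — the idea is simply to keep tweets that plausibly discuss the city rather than tweets that merely mention a user with the same name:

```python
def mentions_city(tweet_text, city="Edmonton"):
    """Crude relevance heuristic: keep a tweet only if it contains the
    city name alongside at least one civic keyword, so that a user who
    merely shares the city's name gets filtered out."""
    civic_terms = ("city", "council", "transit", "park", "road", "snow")
    text = tweet_text.lower()
    if city.lower() not in text:
        return False
    return any(term in text for term in civic_terms)

tweets = [
    "Edmonton city council votes on new transit plan",
    "@Edmonton thanks for the follow!",
    "Snow clearing in Edmonton starts tonight",
]

# Only the tweets that appear to be about the city itself survive.
civic_tweets = [t for t in tweets if mentions_city(t)]
```

A heuristic like this is easy to run as a batch job on harvested tweets, which is why back-end processing was far more tractable in a week than real-time stream filtering.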

Our application included data on school locations, health facility locations, grocery store locations, gas station locations, green space, cultural facilities, emergency services, census dissemination areas and Twitter data, all of which were presented as different map layers. We employed the Agile development method for CODE, meaning we quickly built a prototype, tested it, and then repeated the process, adding functions or re-developing bad code.

In actuality, though, we built three prototypes in the first two days and spent another two days testing, selecting and re-developing. The Agile method helped us keep CODE functional and easy to extend at all times. The only drawback was that local code synchronization became necessary before we pushed to GitHub; if two of us pushed different code at the same time, our repository would end up in a mess. By late Thursday night, we had nearly finished all the planned coding and had even begun to improve the user experience. The search widget and navigation buttons were added on the last day to make open data easy and ready to use in our CODE application.

We felt that by putting information in the hands of concerned citizens and community leaders, CODE serves as a proof-of-concept for data-driven approaches to building a strong sense of community across cities. CODE also connects people and governments by allowing them to create forums for conversation in specific communities across a city or search social media sites to find other people in their area.

Furthermore, by integrating and visualizing open data at a community scale, CODE demonstrates a new approach for community exploration. In fact, users can search and select different open data for certain communities on one map, and corresponding statistics are shown as pop-ups. In the initial phase, we provided this community exploration service for citizens in Edmonton and Vancouver.

Overall, I felt attending this ECCE App Challenge was a great experience in integrating ESRI web technologies with open data research. It proves that open data can act as a bridge between citizens and cities, and that ESRI products significantly simplify the building of just such a bridge. We believe more applications will be inspired by the ECCE App Challenge and that open data will become part of everyday life. Thanks to ESRI, we got a chance to help shape the next generation of community exploration.

If you have thoughts or questions about this article, get in touch with Jin Xing, Geothink’s Information Technology Specialist, at jin.xing@mail.mcgill.ca.

CODE Hackathon Set to Kick Off as New Report Finds the World’s Governments Slow to Open Government Data


A new year for open data? (Photo Credit: Tactical Technology Collective)

By Drew Bush

In the first weeks of the New Year, two important news items for the Geothink audience made headlines. In Toronto, the Canadian federal government got ready to kick off its second annual multi-city Canadian Open Data Experience (CODE), while the World Wide Web Foundation ranked the United States 2nd and Canada 7th for openness of governmental data in its second annual Open Data Barometer.

Canada Ranked 7th

Canada tied for seventh with Norway among the 86 countries surveyed, based on whether government data was “open by default” as stipulated in the 2013 G8 Open Data Charter. More important, however, was the country’s positive movement in the rankings and scores from last year, rising one spot in the index.

The survey examines availability of core government data such as company registers, public sector contracts, land titles, how governments spend money and how well public services perform. The U.K. is considered the global leader for open government data, publishing nearly all of these types of data.

Globally, the authors of the report state “there is still a long way to go to put the power of data in the hands of citizens. Core data on how governments are spending our money and how public services are performing remains inaccessible or pay-walled in most countries.”

That’s because fewer than 8 percent of surveyed countries publish datasets on subjects like government budgets, spending and contracts, and on the ownership of companies, in bulk machine-readable formats and under open re-use licenses.

A few key highlights of the report:

1. Only the U.K. and Canada publish land ownership data in open formats and under open licenses.

2. Only the U.K. and the U.S. publish detailed open data on government spending.

3. Only the U.S., Canada and France publish open data on national environment statistics.

Finally, open mapping data is published only in the U.K., the U.S. and Germany (an area where Canada lags).

CODE Hackathon Kicks-Off

In Toronto, developers, graphic designers, students, and anyone interested in trying their hand at coding are getting ready to create innovative uses for the Canadian government’s open data and to win up to $15,000 from the Government of Canada. The 48-hour event is set to begin on February 20th.

Innovations developed at hackathons like this could one day fuel improvements in access to government data. Organizers said the event attracted 927 developers in 2013, and that the number had grown to more than 1,000 by the day of this year’s event.

“Open data is a brand new industry,” Ray Sharma, founder of the event and XMG Studios, told CTV News. “We are in an iceberg situation where we’ve only seen the tip of the data that will become available.”

But just what kind of industry is open to debate, as Geothink researchers Peter Johnson and Pamela Robinson examined in a recent paper. Their questions included whether civic hackathons have the potential to replace the traditional ways that government purchases products and services, and whether these events can be considered new vectors for citizen engagement, according to a post Johnson wrote for Geothink.

For more on CODE, you can watch Canada’s President of Treasury Board, Tony Clement here or read more about this year’s event here.

If you have thoughts or questions about the article, get in touch with Drew Bush, Geothink’s digital journalist, at drew.bush@mail.mcgill.ca.