Last Friday we launched the Global Integrity Report: 2011. In generating this year’s report, we began thinking about new ways to take the research forward. One option we began discussing internally was to “open source” the Report by producing in-depth documentation and providing free public access to our web-based fieldwork platform, Indaba, so that other organizations could generate their own country assessments without our direct involvement.
To help us think through some of the benefits (exciting!) and risks (scary!) of changing our flagship annual report, we pulled together a panel of experts at our launch event hosted by Johns Hopkins-SAIS for a public discussion on the merits and pitfalls of open sourcing the Global Integrity Report. The panel raised a range of interesting issues for us to think about.
For one, quality of the data matters. A lot. It’s clear that whatever changes we make to expand the methods by which we create the Report, we don’t want to jeopardize the quality of the data generated. This leads us to an interesting set of questions: Who will be responsible for doing the quality check – a global community of contributors (many of whom we may not know) or Global Integrity staff? How does one develop a community that cares about the data and self-polices the content? Does a self-moderated community strip neutrality and credibility from the data?
Another issue we’re still mulling over is comparability: do people want the ability to compare results across countries? Based on the different perspectives offered by the panelists, the answer isn’t clear. In the decade we’ve been producing the report, we’ve changed our own minds about comparing countries, too.
One final issue we’ll throw out (although many others were discussed) is how to manage risk when our methodology forks, as moderator Alex Howard assured us is bound to happen (in plain terms, a community-led approach to the assessments would lead to new and different questions being asked in different countries). Some suggested we build in version control or maintain a parallel “classic” Global Integrity Report alongside whatever new cutlery grows out of our core methodology.
As an organization, Global Integrity will be exploring these issues, and the bigger question of how to evolve the Global Integrity Report, in the coming months. But we need your help to do it: we welcome your thoughts and critiques in the comments. Let us know if there is something about the Global Integrity Report that you can’t live without, or if we’re kicking around some ideas that you know would never work. We look forward to sharing our thoughts as soon as we have them.
If you missed the event in person, you can watch the video from the live stream here.
–Carrie Golden
–Image: Carrie Golden