Global Integrity strives to be transparent about our work, from our methodologies and data to our sources of funding. We recently completed a survey of the field contributors involved in producing the Global Integrity Report 2011, and we’re happy to share the results publicly. Keeping our field contributors happy and engaged is a major priority for us, so we highly value their feedback. In a short survey, we asked them to rate several aspects of their experience working on the Global Integrity Report 2011, including the methodology, training, staff support, our tools, and their compensation. We received responses from 72 field contributors: 20 lead reporters, 18 lead researchers, 31 country peer reviewers, and 3 regional peer reviewers.
We were proud to learn that respondents generally enjoyed working with us (97%), that GI staff responded quickly (92%), and that respondents would work with us again (94%). Contributors left us gratifying comments like “I found the process quite inspiring and informative,” and “My relationship with Staff couldn’t have been better.” As many of our contributors go on to seek other governance-related opportunities, we were encouraged by one contributor’s comment that “Global Integrity has enhanced my career.” Another contributor wrote to us, “I have learned a lot working with Global Integrity. As an International Relations graduate student too, I am confident I can handle research concepts, organization techniques and simplicity of report writing.”
Respondents also told us that our specialized fieldwork platform, Indaba, worked well (94%) and was easy to use (94%); look for more on the Indaba user experience in a separate post soon.
Despite such flattering feedback, respondents also let us know where they think we can improve. For one, some contributors (13%) felt they did not receive clear and complete instructions about our methodology, and several offered suggestions for improving the fieldwork process in the future. We’re already working to provide a better explanation of our methodology and to revise our training materials with clearer, more specific examples. We’re also planning to fine-tune our scoring criteria.
Although most respondents reported understanding what was expected of them when they signed their contract (98%), we also received the perennial critique that compensation was not commensurate with the level of effort required for the project. That is, they wished we could pay more. So do we.
Some respondents raised questions about the limits of our indicators. One contributor commented, “Many questions do not apply to my country, they should be omitted when calculating the final scores.” Others told us we should “customize the survey questions to countries,” and that the “one size fits all aspect of the indicators can be frustrating because desired score cannot be correctly reflected.” While we understand those concerns, the nature of the Global Integrity Report approach has always been to evaluate countries against the same set of criteria regardless of their political system, history, or level of economic development. We consider the questions we ask through the Global Integrity Report methodology to be universally applicable, based on both theory and our own experience.
We’re especially grateful for our contributors’ feedback, an invaluable piece of the big picture, as we take a step back to reevaluate our approach and explore new ways to build on the methodology and data collection behind our flagship report.
— Global Integrity