What I learnt from looking behind The Global Journal's Top 100 NGO ranking

It is probably fair to say that The Global Journal's Top 100 NGO ranking had a bit of a bumpy start. When the first edition launched in 2012, Dave Algoso’s critical post and editor Jean-Christophe Nothias’ harsh response quickly dominated perceptions in the aid blogosphere. So when the second edition was published in January 2013, vaguely hinting at ‘innovation, impact and sustainability’ as key new criteria to assess NGOs, I was sceptical and mentally prepared for more critical comments. Luckily, the researcher in me won over the potentially ranting aid blogger, and I sent out messages to a variety of organizations featured in the ranking as well as to the editorial team, asking for more details on process and methodology. I received open and positive feedback all around, and one 20-page methodology paper, a couple of email exchanges and a 25-minute phone conversation with one NGO later, a much more nuanced picture had emerged of the ranking, the learning processes behind it and the space for discussion it could facilitate further.

The Global Journal did its homework
The background paper ‘Evaluating non-governmental organisations – An overview of The Global Journal’s Top 100 NGOs methodology in 2013’ by researcher Cecilia Cannon, currently based at the Graduate Institute of International and Development Studies in Geneva, consists of two parts. The first part is a more conventional and broader political science analysis of how to conceptualize NGOs and evaluate NGO work. In my academic opinion, this part focuses a bit too much on ‘traditional’ organizations and well-known global civil society debates, and does not take into consideration the range of innovative organizations that the actual ranking features. Acumen Fund, Wikimedia Foundation or Root Capital do not really fit into traditional NGO categories, and it would be interesting to discuss some of the broader philanthropic changes further.

The second part engages more specifically with the ranking and its criteria and sets the tone on the first page:
By clearly articulating what each criterion is based upon and developing a more focused overall scope of analysis, The Global Journal sought to better address this challenge in 2013. Like any study, there are limitations and room for alterations to the methodology. (p.10)
My main question for this post was how well the methodology corresponds with the aims and objectives of the ranking: to ‘showcase diversity’, ‘evaluate NGOs comparatively’, ‘stimulate constructive debate’ and ‘present a range of good NGO practice’. All in all, I believe that the ranking addresses these issues well, with the right mixture of ‘hard facts’ and the necessary openness when addressing contested or hard-to-define terms like ‘impact’. I will comment in more detail on the process in the next section, taking NGOs’ perceptions into consideration as well.

Every theme comes with clearly defined sub-categories and a scale, all of which contribute to a weighted final score. To me, this looks like good practice for establishing a ranking, and Cecilia Cannon also identified some ways to improve the methodology further:
Assessing an NGO on an ordinal scale of 1-20, for example, presents the challenge of differentiating between 20 possible scores against criteria that require some level of qualitative judgement. A smaller scale for scoring would enable less scope for subjective inconsistencies. This would improve the quality of future evaluations and would enable independent reviewers to potentially replicate the results when using the same criteria for evaluation. (p.17)
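To make the weighted-scoring idea a bit more concrete, here is a minimal sketch of how sub-category scores on a coarser scale could be combined into a final score. The themes, weights and the 1-5 scale are purely hypothetical illustrations for this post, not The Global Journal's actual criteria or weighting.

```python
# Hypothetical illustration of a weighted ranking score.
# Themes, weights and the 1-5 scale are assumptions for this sketch,
# not The Global Journal's actual methodology.

THEME_WEIGHTS = {
    "impact": 0.4,
    "innovation": 0.3,
    "sustainability": 0.3,
}

def weighted_score(scores: dict, max_points: int = 5) -> float:
    """Combine per-theme scores (1..max_points) into a 0-100 weighted score."""
    total = 0.0
    for theme, weight in THEME_WEIGHTS.items():
        total += weight * (scores[theme] / max_points)
    return round(100 * total, 1)

# Example: a hypothetical NGO scored on the 1-5 scale.
print(weighted_score({"impact": 4, "innovation": 5, "sustainability": 3}))  # 80.0
```

The point of the coarser scale, as Cannon suggests, is that two independent reviewers are far more likely to give the same organization the same score out of 5 than out of 20, which makes the results easier to replicate.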
Most important at this point is that The Global Journal clearly asked for external advice, is open to discussion and seems willing to fine-tune its methodology further, given that there will never be such a thing as a ‘perfect ranking’.
The view from the NGOs: ‘It felt like a grant application’
I selected a random sample of 11 organizations from across the ranking, controlling for size, ‘brand recognition’ and location.
To date, I have received 5 responses (2 large international, 1 medium-sized & 2 small NGOs). Again, these were open and friendly exchanges, and a picture is emerging of the amount of information that those organizations turned over to The Global Journal:
They requested a fair amount of due diligence information regarding [the organization], similar to what funders frequently request of us, including financials, our impact data, annual reports, etc. (small NGO A)
We were asked to deliver rather detailed information and documentation based on two questionnaires. The material included all parts of the operation as well as all annual reports – budgets, staff, donors, concrete operations, strategies, accountability set up etc. (medium-sized NGO)
[T]his year’s nomination e-mail did include a link to a lengthy online questionnaire (I don’t have a copy of this, but it should still be online), and a shorter PDF questionnaire (attached). We politely explained that we did not have time to fill out the questionnaires, and referred them to the answers we provided them for the 2012 edition, and to the fairly extensive information publicly available on our website and in our annual report. We also provided them with a selection of photos that they requested. (large NGO A)
Balancing the need for thorough information with the time commitments of organizations is certainly an issue (‘it was a week’s worth of work’, small NGO B), and one of the biggest challenges seems to be how to treat large global players and small local organizations fairly - even if that means that they may be treated slightly differently with regard to the requirements for information. But more importantly at this point, there were no questions about the rigor of the process and everybody noticed the changes compared to the first ranking process.

‘The ranking really helped us in gaining international legitimacy and more global exposure’
In my long conversation with one small organization, I really felt that this is a key aspect of the exercise. For well-known global organizations, inclusion in the ranking is probably just one small add-on to their communications strategy, and the impact on funding will probably be small at best. But smaller, innovative organizations really seem to get something valuable out of the exercise, which for them was the equivalent of a ‘grant application’ or ‘external evaluation’ in terms of time and staff commitment. After last year’s inclusion, the organization was invited to an international conference and introduced as an ‘expert’ in its area, and further networking and publishing opportunities arose as well. Future rankings and research may show how smaller organizations also benefit in more tangible ways if they continue to be among the ‘world’s best’ NGOs.

Personal learning: Looking behind the headline pays off, as many things are indeed more complicated in development
I am really glad that I made the effort to reach out to The Global Journal and some of the organizations featured in the ranking. Judged against its self-proclaimed aims, the ranking is delivering on many counts. If the process continues to be treated seriously, and at the same time as an opportunity for discussion and learning for the aid community, Cecilia Cannon’s conclusion that

The Global Journal has made a serious effort to continue to develop and implement a considered and consistent approach. (p.19)
may help to establish the Top 100 NGO ranking, after its bumpy start, as a household resource with an approach that many in the aid industry can agree on.
