An analysis of NGOs' performance against the CHS

My latest post dealt with the CHS and its certification scheme, with particular regard to HQAI, currently the only organization delivering certification services.

On the HQAI website, the summary reports of the audits conducted are available for consultation, and they are an interesting resource for understanding the challenges involved in implementing the CHS.

The performance of an organization against each commitment is scored with a numeric indicator from 0 to 5: 0 represents a major non-conformity that prevents the issuing of a certification, 1 a minor non-conformity that needs to be addressed within a defined time span, 2 an observation whereby action is advised, and 3 and above indicate conformity with the standard. Roughly speaking, scores of 1 and 2 mean that policies are in place but are not properly implemented, while 0 means an absence of internal policy, or a policy that goes against CHS requirements. More details are available in the appendix of each summary report, and on the CHS Alliance website (see in particular the self-assessment tool).
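As a quick reference, here is a minimal sketch in R of the scoring scale just described, using a hypothetical helper score_label() of my own; the label wording is mine and not HQAI's official terminology, and it only covers the discrete rubric values (the per-commitment scores used later in this post are averages and take decimal values).

    # Minimal sketch of the scoring scale (assumed labels, discrete scores only)
    score_label <- function(score) {
      ifelse(score == 0, "Major non-conformity (certification not issued)",
      ifelse(score == 1, "Minor non-conformity (to be addressed within a set time span)",
      ifelse(score == 2, "Observation (action advised)",
                         "Conformity with the standard")))
    }
    score_label(0:5)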

As the score is a numeric indicator, it is relatively easy to get a glimpse of the areas where the most significant issues arise. This is what I will try to do here, with a first descriptive analysis of the scores available on the HQAI website.

I refer to the results of the initial assessment, even when a later interim assessment is available, in order to see the issues all NGOs face before taking the measures suggested by HQAI. In particular, I will try to see:

    • Which commitment is the most difficult to meet;
    • Which commitment is most successfully met;
    • Whether the performance against each commitment is uniform;
    • Whether each organization has a uniform performance across the commitments.

The data were accessed on 15 March 2018, and they are available at the end of the post in .csv format. Note that for one organization the numeric scores are not available, so it was not included in the analysis (hence the data reflect the results of 15 organizations).
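For readers who want to follow along, here is a minimal sketch of how the data could be loaded and reshaped for ggplot2. The file name chs_scores.csv and the column layout (one row per organization, a column "organization" and one column per commitment, C1 to C9) are assumptions for illustration; the actual code and data at the end of the post may differ in the details.

    # Minimal loading/reshaping sketch (assumed file and column names)
    library(tidyr)

    scores_wide <- read.csv("chs_scores.csv", stringsAsFactors = FALSE)

    # Drop the organization whose numeric scores are not available, then
    # reshape to the long format expected by ggplot2 (one row per
    # organization/commitment pair)
    scores <- scores_wide[complete.cases(scores_wide), ]
    scores <- pivot_longer(scores,
                           cols = starts_with("C"),
                           names_to = "commitment",
                           values_to = "score")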

In order to have a first look at the data, I have plotted the results of each organization against all the commitments in a radar chart, with a comparison against the average result (in black). The data were plotted in R with the ggplot2 library; the code is included at the end of the post. Here is where I got the inputs for the code for the radar chart.

[Radar chart: each organization's scores against the nine commitments, with the average in black]
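As a quick illustration, here is a minimal sketch of how such a radar-style chart can be approximated in ggplot2 with coord_polar(), assuming the long-format scores data frame from the loading sketch above (columns organization, commitment, score). It shows one possible layout, small multiples with one panel per organization and the average drawn in black; the full code at the end of the post may differ.

    # Radar-style chart sketch (assumes the long-format `scores` data frame
    # built in the loading sketch above)
    library(ggplot2)

    # Average score per commitment, drawn in black as a reference
    avg <- aggregate(score ~ commitment, data = scores, FUN = mean)

    ggplot(scores, aes(x = commitment, y = score, group = organization)) +
      geom_polygon(fill = NA, colour = "grey60") +
      geom_polygon(data = avg, aes(group = 1),
                   fill = NA, colour = "black", linewidth = 1) +
      coord_polar() +      # wraps the commitment axis into a radar-like layout
      ylim(0, 5) +
      facet_wrap(~ organization) +
      theme_minimal()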

A total of 16 organizations have gone through some form of verification, with 13 opting for the certification initial assessment, 2 for verification and 1 for benchmarking. Of all of these, only one received a major non-conformity (against commitment 5, on complaint mechanisms), while four received one minor non-conformity and one received two. No organization scored higher than 4 against any commitment.

With regard to the commitments, the fifth (Complaints are welcomed and addressed) seems to be the one against which most of the NGOs audited are facing challenges. Across all the organizations, this commitment has the lowest median (2), average (1.84) and mode (2), and it is the only one where an NGO scored a 0. More to the point, all but two organizations register their lowest score against this commitment, and only one organization managed to score a 3.

With regard to the commitment against which the NGOs score best, commitment 6 (Humanitarian response is coordinated and complementary) has the highest median (3.2), average (3.25) and mode (3), and it is the one against which most of the organizations obtain their best score (12 out of 15). It is also the only commitment against which two NGOs managed to score a 4, with no organization scoring below 2.6.

More generally, NGOs seem to have quite a uniform performance, ranging between scores 2 and 3 across all the commitments, with standard deviations between 0.25 (commitments 1 and 8) and 0.43 (commitment 7). The exceptions are commitment 5 (lower scores and an SD of 0.65) and, as already noted, commitment 6, which has higher scores.
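These per-commitment figures can be reproduced with a few lines of R. The sketch below again assumes the long-format scores data frame from the loading sketch above and computes the mean, median and standard deviation for each commitment.

    # Per-commitment summary statistics (assumes the long-format `scores`
    # data frame: organization, commitment, score)
    library(dplyr)

    scores %>%
      group_by(commitment) %>%
      summarise(mean   = mean(score),
                median = median(score),
                sd     = sd(score))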

In this case a visualization with a boxplot can give a better understanding of the distribution of the scores. I invite you to have a look at the interactive version of the graph available here (or by selecting the plot).

[Boxplot: distribution of scores by commitment]
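A static version of this boxplot can be produced as in the minimal sketch below, again assuming the long-format scores data frame; it is only an approximation of the plot shown above.

    # Boxplot of scores by commitment (assumes `scores` as above)
    library(ggplot2)

    ggplot(scores, aes(x = commitment, y = score)) +
      geom_boxplot() +
      ylim(0, 5) +
      labs(x = "Commitment", y = "Score") +
      theme_minimal()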

In terms of uniformity of the performance of each organization, there is no real regularity, with standard deviations ranging from 0.29 to 0.9, irrespective of the average score. Also in this case I plotted the results in a boxplot (here for the interactive version; refer to the comments for the organization key on the x axis).

[Boxplot: distribution of scores by organization]
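The per-organization spread can be computed and plotted in much the same way; the sketch below, again on the assumed long-format scores data frame, computes each organization's mean and standard deviation and draws the corresponding boxplot.

    # Spread of scores within each organization (assumes `scores` as above)
    library(dplyr)
    library(ggplot2)

    scores %>%
      group_by(organization) %>%
      summarise(mean = mean(score), sd = sd(score)) %>%
      arrange(desc(sd))

    ggplot(scores, aes(x = organization, y = score)) +
      geom_boxplot() +
      ylim(0, 5) +
      labs(x = "Organization", y = "Score") +
      theme_minimal() +
      theme(axis.text.x = element_text(angle = 45, hjust = 1))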

This is just a preliminary and sketchy analysis, based on a population that is still limited. However, some tentative conclusions can be drawn:

  • Almost all of the organizations surveyed have policies in place that comply with each of the CHS commitments, although their implementation is not always fully in line with the requirements of the standard;
  • In particular, the implementation of complaint mechanisms is a common challenge;
  • While the performance against each commitment is rather regular, the performance of each organization is quite varied.

Comments, ideas and opinions are as usual appreciated!


Data (Entered manually from the HQAI website, accessed on 15 March 2018. As you will see from the code below, a bit of manipulation is required for correct use with ggplot2.)

R Code
