And to think, it only took them 3 or 4 years of 1-star organizational efficiency ratings to finally learn how to game the Charity Navigator system so that things looked better on the books.
Leave it to Ryan Kaldari to conveniently not mention that, despite any reforms or improvements in how they categorize their cash, the WMF still spends only 46% of its revenues on program expenses. In other words, "We need your money, everyone! (But not really.)"
And, of course, there's very little examination of exactly what Charity Navigator just did -- they reconfigured their rating system so dramatically that, "Using the expanded and more in-depth rating system, fully half the charities evaluated (more than 2,700) received new star ratings." Does anyone else wonder what sort of changes would cause 50% of the "old" ratings to now be wrong?
You can easily see why the new ratings system has so dramatically altered the scores -- a new-found emphasis on "transparency" issues (WMF's strong suit, unless you're asking how much they paid Q2 Consulting for the no-bid research contract) can severely punish organizations that are more "closed". See this one, for example, which, despite financials roughly on par with the WMF's, now gets only 1 star out of 4 because of all the red X's in its accountability section. On the flip side, there's no explanation for why this charity jumped from 1 star overall to 4 stars overall.
It seems to me that we're wise to keep focusing on the federal Form 990, as the WMF can't really cheat on that one (unless you count disguising the fact that 60% of the early boards were composed of business partners). The Form 990 numbers continue to show clearly that program expenses receive less than half the money donors give to the WMF, while other, more reputable charities manage to deliver upwards of 85% of revenues to the causes they uphold.

This post has been edited by thekohser: Thu 22nd September 2011, 11:15am