The Good Universities Guide results (GUG) have been released for another year, and university marketing departments across the country are scrambling to highlight their four and five-star achievements.
Not that we score many anyway, but over the years I have put a stop to my own university spruiking its four and five-star results.
The GUG is an utterly unreliable and flawed source of performance data which does the sector irreparable harm each year.
I will not acknowledge our high star scores just as I refuse to accept our one-star scores. In fact, this year I went a step further and boycotted the release of survey data to the GUG, meaning CQUniversity recorded “not rated” entries for learning and teaching. A case of sour grapes? Maybe. But there is now a handful of highly reputable universities that, just like us, are no longer taking part in the statistical vandalism that is the GUG.
Let me provide just one example of the methodological nonsense peddled by the GUG. Several key measures of graduate experience (overall satisfaction, teaching quality and generic skills) in the GUG are derived from the Australian Graduate Survey (AGS), a national census of recent graduates from degree programs.
The AGS is governed by a Code of Practice, agreed by all Australian universities. The AGS provides raw data for the GUG but, importantly, it is then misused and misrepresented by the GUG.
The AGS data for overall satisfaction, for instance, is based on percentage satisfaction, with CQUniversity scoring around 81% in the last two surveys. In fact, all Australian universities sit within a band from around 77% to 88%. This shows that, in general, four out of every five graduates from Australian universities are satisfied with the overall quality of their education – a fantastic vote of confidence in the integrity of our sector. However, the GUG takes this data and slices it to exclude all postgraduate and sub-degree students (cutting out just over half of our higher education graduates), then ranks (not rates) the remaining results from highest to lowest, assigning a five-star score to the top quintile (those over 85%) and a one-star score to the lowest quintile - those around 80%. The GUG then confuses its readers by describing these scores as ‘ratings’, when clearly they are rankings.
In the 2015 GUG, CQUniversity was among many good universities scoring just one star for overall satisfaction - an improvement of just 4-5 percentage points would have made us five-star, a distinction that is neither intuitive nor valid.
This approach flies in the face of common sense - we are all used to true star ratings, for example hotel ratings on TripAdvisor or the stars allocated for the energy and water efficiency of electrical goods. We know, for instance, that there is a world of difference between a one-star hotel and a five-star hotel. Yet the GUG doesn’t take this approach - instead it bases its ‘ratings’ on rankings, so there will always be the same number of five-star and one-star universities, irrespective of the widespread high quality of educational outcomes across the sector, and the marginal difference between a five-star and a one-star result. Converting the percentage figures to a true five-star scale would actually give all universities satisfaction ratings of between 3.8 and 4.4 stars.
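To make the contrast concrete, the two approaches can be sketched in a few lines of code. The satisfaction figures below are illustrative only (not actual AGS results), chosen to sit within the real sector band of roughly 77% to 88%:

```python
def quintile_stars(scores):
    """The GUG-style approach: rank scores from highest to lowest and
    assign stars by quintile, so the top fifth of institutions always
    gets five stars and the bottom fifth always gets one."""
    order = sorted(scores, reverse=True)
    n = len(order)
    stars = {}
    for s in scores:
        rank = order.index(s)            # 0 = highest score
        stars[s] = 5 - (rank * 5) // n   # quintile bucket -> 5..1 stars
    return stars

def linear_stars(score):
    """A true rating: map the percentage directly onto a 0-5 star scale."""
    return round(score / 100 * 5, 1)

# Ten hypothetical universities, all within a narrow ~77-88% band.
satisfaction = [88, 87, 85, 84, 83, 82, 81, 80, 78, 77]

quintiles = quintile_stars(satisfaction)
for s in satisfaction:
    print(f"{s}%  ranked: {quintiles[s]} stars   rated: {linear_stars(s)} stars")
```

Run on these figures, the quintile ranking spreads near-identical results across the full one-to-five-star range, while the proportional rating places every institution around four stars - which is the article's point.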
I could go on and on about the GUG’s habitual misuse of data – like how my university sees over 90% of our students excluded from the “Starting salary” and “Getting a full time job” categories, leaving us with just two and three-star results in previous years (I’ve always scratched my head on that one, given AGS data routinely shows our graduates are above the national average on both fronts). Or how our own real-time in-house student surveying shows vastly different levels of student satisfaction from the five-year lagged data peddled by the GUG. But essentially it boils down to the GUG unashamedly violating the AGS Code of Practice, which states: ‘the data should be used with impartiality, objectivity and integrity … using methodologically sound and transparent methods’, avoiding ‘false, deceptive or misleading ways’ and ‘not… knowingly undermine the reputation and standing of institutions in the Australian sector’.
It may be difficult at times to wear the “not rated” results in this year’s GUG because of our boycott. No doubt it will raise many questions. But I am buoyed by other universities joining with CQUniversity in our refusal to acknowledge the GUG as a valid and reliable industry assessment tool. In the next few years, the Quality Indicators for Learning and Teaching (QILT) initiative will further strengthen national data, and I predict this will eventually consign the GUG’s misuse of star rankings to the scrap heap.