Thursday, October 11, 2007

Grade Inflation Or Claim Inflation

This piece has appeared in the Guardian today, and it makes a very interesting case study for anyone who likes to compare headlines with content.

The article, by Polly Curtis, claims that an "authoritative new study" presents "evidence of grade inflation" at Russell Group universities, which will lead to "accusations of dumbing down" that haven't been levelled at these universities before.

This is all very exciting. Let's look at what's actually happened.

Mantz Yorke, who has been studying this area for some time and knows his stuff, has written a book (published back in April). In it he presents some data on degrees awarded which shows that the number of 2:1s and Firsts awarded went up between 1994/5 and 2001/2.

So, the data is old.

In the article, Yorke is quoted as saying,
"My evidence suggests that people who attack colleges and new universities for softening or dumbing down are perhaps a little premature"
Which is not quite the same as accusing the Russell Group of grade inflation.

Later in the article, Curtis herself says,
As Yorke himself points out, rising grades do not necessarily indicate 'grade inflation'.
Before going on to explain why. In addition, the claim that this is all rather new will come as news to Tony Mooney, author of this piece from 2003 in the Guardian on, er, Mantz Yorke's research into grade inflation (using, er, the same data as in this year's book), in which Yorke concludes,
...there is, on present data, little evidence that the percentage of good degrees has been inflated across any of the whole universities whose data have been analysed.

So, let's summarise. An eye-catching piece in the main paper, currently occupying a key spot on the much-visited Guardian website, concerns a book that has been out for six months, analysing data that is now over six years old, which doesn't support the article's headline or opening paragraphs, and which rehashes a four-year-old piece from the same newspaper anyway.

And yet the effect, for casual readers, will be to give the impression that our degree system is deteriorating. It smacks of a journalist playing the age-old game of 'Let's You And Him Fight', justified by next week's publication of a review of qualification classifications.
There is a need for a proper, grown-up debate on degrees and their value, but this is not the opening sally into that arena, unfortunately. This isn't a case of educational grade inflation, just a case of journalistic claim inflation.


Tuesday, October 09, 2007

Careers Advice Is All Rubbish, Say All

Two important pieces of work out in the last few days, and they both have one thing in common - they criticise careers advice to science students.

The first is the leviathan Sainsbury Review of Government Science and Innovation Policy, largely unreported by a press more concerned that it isn't going to get to play elections this autumn than with the long-term viability of the UK as a science and technology innovator.

The second is a much more specialist report from the Council for Science and Technology, which reviews how the situation for young researchers at university has improved since the Roberts Review finally told the world, in 2002, how badly our brightest young people were being treated.
Depressingly, the answer to 'how have things improved' is 'hardly at all'.

Anyway, both reports are authoritative, consulted widely, and criticise careers advice and guidance without, apparently, speaking to anyone involved in producing or disseminating it in HE.

Indeed, neither report really even tries to summarise what is available. This is especially disappointing in the case of the CST report, because one thing that has definitely improved since Roberts is the standard and availability of careers information and guidance for young researchers. Many universities have dedicated, well-briefed advisers specifically for PhD graduates and postdocs. This is not reflected in the report.

The Sainsbury Review, meanwhile, calls for summaries of graduate populations to be published when most of the relevant data already exists and is available to the general public.

Careers advice is an easy - and fashionable - target. It is a shame that advisers, researchers, and the bodies who deal with either at HE level appear not to have been consulted (the Sainsbury Review did, at least, appear to consider Connexions before sticking the boot into HE).

More to come on these reports - hopefully a little more positive.


Tuesday, September 18, 2007

Education at a Glance 2007

The OECD has produced the annual 'Education At A Glance' international comparison report and, as is now traditional, it does an excellent job of trashing a lot of persistent myths about the UK higher education system.

- We send too many people to university
No, actually, we don’t. We lag behind a number of countries in terms of university participation, including Australia and New Zealand. Our rate of increase in university participation has slowed down considerably, and a whole suite of countries are expanding more rapidly than us. We’ll get overtaken by all sorts of nations at the current rate.

- We don’t have many science graduates
Actually, we seem to – 1.9% of the employed population aged 25-34 have a science degree or higher compared to an OECD average of 1.3%. 18% of degree holders got science qualifications compared to an OECD average of 11% - only Ireland is higher. (To be fair, we have plenty of biologists, but not many chemists, for example).

- It’s not worth going to university.
I can forgive press misinterpretation of some issues in higher education, but this is the one where I feel it is guilty of damaging misrepresentation.
The OECD rather starkly demonstrates just how beneficial going to university is – an earnings advantage, for graduates aged 30 to 44 years, of 61%. It goes up to 77% for a 2:1 or higher. Part of this is because, as the report admits, employment prospects for those who have no upper secondary qualifications are especially poor. The national economy is changing so that those with poor or no skills will soon have very little opportunity.
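To put those percentages into pounds and pence, here's a quick back-of-the-envelope sketch in Python. The baseline non-graduate salary is entirely made up for illustration; only the 61% and 77% premiums come from the figures quoted above.

```python
# Illustrative only: the baseline non-graduate salary is hypothetical,
# not an OECD figure; the 61% and 77% premiums are the ones quoted above.

def graduate_earnings(non_graduate_salary: float, premium: float) -> float:
    """Earnings implied by a given graduate earnings premium."""
    return non_graduate_salary * (1 + premium)

baseline = 20_000  # hypothetical annual non-graduate earnings, in pounds

for label, premium in [("degree (61% premium)", 0.61),
                       ("2:1 or higher (77% premium)", 0.77)]:
    print(f"{label}: £{graduate_earnings(baseline, premium):,.0f} "
          f"vs £{baseline:,} without a degree")
```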

With that in mind, those who discourage young people from trying to improve their own educational level ought to be very careful, and should make themselves au fait with the actual situation. This is not data that is amenable to being manipulated by our Government, and indeed there is a fair amount in here to concern it.


Monday, July 30, 2007

That Drop-Out Report In "Brief"

Breathless stories about this report from the National Audit Office, assessing dropout rates from UK universities, have led people to believe, yet again, that there's something badly wrong with the UK's HE system.

Yet again, this view is largely the result of people not actually reading what they see fit to comment on.

Briefly, the report looks at the number of non-completers from university between 1999 and 2005, and deduces a probable non-continuation rate of around 20 per cent, with certain factors, such as background, being indicators of the likelihood of leaving university.

Cue predictable laments about the decline of the HE system.

Except....

The dropout rate is not really a dropout rate - as the authors themselves say:

There are particular difficulties with data about part-time students due to the inherent flexibilities in patterns of study
Or, in other words, it's an overestimate, because some part-timers take longer to complete than expected.
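To see why that inflates the figure, here's a rough sketch with entirely made-up numbers (they're not the NAO's): count every part-timer who hasn't finished within the expected window as a non-completer and the measured rate climbs above the true one.

```python
# Hypothetical cohort, for illustration only - these are not the NAO's figures.
full_time = {"total": 800, "left_without_award": 120}
part_time = {"total": 200, "left_without_award": 30,
             "still_studying_past_expected_time": 40}

# Naive measure: anyone not finished within the expected window is a non-completer.
naive_non_completers = (full_time["left_without_award"]
                        + part_time["left_without_award"]
                        + part_time["still_studying_past_expected_time"])

# Truer measure: only those who have actually left without an award.
actual_non_completers = (full_time["left_without_award"]
                         + part_time["left_without_award"])

cohort = full_time["total"] + part_time["total"]
print(f"Naive non-continuation rate:  {naive_non_completers / cohort:.1%}")   # 19.0%
print(f"Actual non-continuation rate: {actual_non_completers / cohort:.1%}")  # 15.0%
```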

The next problem for the doom-mongers is on page 17 - a graph taken from the OECD's Education At A Glance showing, er, that the UK's dropout rate is actually rather low - higher only than those of Korea, Greece, Ireland and Japan. Ireland's case is especially interesting: its HE system has grown very rapidly and now sees greater participation amongst young people than the UK's, yet it has a lower dropout rate - which deals a real blow to the popular theory that trying to send 50 per cent of 18-30 year olds into HE is the cause of this 'high' dropout rate.

The most interesting part of this is the historical context, though. Some reports have made a couple of notable assumptions - firstly, that this dropout rate is high - which it may be, but it's lower than almost everyone else's.
The second assumption is that it has gone up. This is not supportable. The report starts with 1999 because that's when this data first started being collected in a systematic way, and the dropout rate has actually fallen (albeit in a marginal and probably insignificant way) since then. Before that, well, I am not at all sure that systematic, country-wide university non-completion data is really available. And I have looked.

This raises a number of questions. Firstly, and most importantly, what drop-out rate is acceptable? The report itself notes that you can't get, and shouldn't want, a 100 per cent completion rate, because many dropouts are actually quite rational. A student taking only the particular modules they needed for a career plan. Someone getting a job they wanted. Children. Someone just deciding that now is not the best time to go to university after all. Japan has the lowest dropout rate at just under 10 per cent, whilst the US runs at nearly 50 per cent. Perhaps we're actually doing rather well. Perhaps our dropout rate is too low? (I don't personally believe that, by the way, but it's worth considering.)

And the second is - were things really better back in the day? I can't find any evidence that they were, and I reserve the right to be very sceptical about anyone who says otherwise without any evidence. And that's something that's happening a great deal at the moment.



Saturday, July 14, 2007

How To Get Your Graduate Survey Covered In The Press

Blimey, has it really been that long since I last posted? I am a very bad and lazy blogger (who has been flat out assessing the postgraduate labour market and having comedy run-ins with a national newspaper, thanks for asking).

This week has seen a kind of distillation of how the media reports stories on graduate employment, graduate salaries and graduates in general.

Two major surveys came out - the Association of Graduate Recruiters survey, and the Higher Education Statistics Agency's Destinations of Leavers from Higher Education survey.

These are hideously long names, so, like everyone else in the field, I'm going to refer to them as the AGR survey and the DLHE survey.

I've covered the AGR survey at length before, so suffice it to say that it's not a bad survey as such; it's just not representative of the graduate labour market - only of the bit of it that pertains to big, private-sector employers, mainly in London and largely associated with the finance industry. It mainly asks what employers aim to pay new graduate recruits this year. It found that the number of vacancies was up, but that the average starting salaries its employers expected to pay in London might be down this year. It usually covers in the region of 20,000 graduates.

The DLHE, meanwhile, is a census survey of everyone who graduated from university in 2006. It asks graduates what they were doing six months after they got their degree. It's a hugely important survey - the basis of a lot of performance indicators for universities and a great deal of careers information - and so it gets a response rate of around 80%. More than 200,000 first degree graduates (it also covers graduates at other levels, but let's set those aside for a moment) replied to the survey.
It found that everything was pretty much as normal. Graduate unemployment remains at around 6% (it was 6.2% last year - it may have budged by 0.1%, but I don't have the detailed figures yet). Graduate salaries haven't fallen. Graduate underemployment seems, if anything, to have gone down - although, again, I'd prefer to do the calculations myself before I'm sure. As an aside, it's not collected by Government - HESA are an independent charity - and the data gets played with by all kinds of awkward independent researchers keen to find where things are going wrong. So it's not a cover-up exercise.
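For a sense of scale, here's some back-of-the-envelope arithmetic using the round figures above - roughly 200,000 respondents at a response rate of about 80% for the DLHE, against the 20,000 or so graduates the AGR survey covers. The unemployment count is made up, chosen only to land near the 6% headline figure.

```python
# Back-of-the-envelope arithmetic using the round figures quoted above.

dlhe_respondents = 200_000    # first degree respondents, roughly, as quoted above
dlhe_response_rate = 0.80     # roughly 80%, as quoted above
cohort_estimate = dlhe_respondents / dlhe_response_rate

agr_coverage = 20_000         # graduates covered by the AGR survey, as quoted above

print(f"Estimated first degree cohort: {cohort_estimate:,.0f}")                     # 250,000
print(f"AGR survey coverage of that cohort: {agr_coverage / cohort_estimate:.1%}")  # 8.0%

unemployed = 12_400           # hypothetical count, not a HESA figure
print(f"Implied graduate unemployment rate: {unemployed / dlhe_respondents:.1%}")   # 6.2%
```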

So, two surveys: one non-representative but with a slightly alarmist (although not terribly significant) message, and one which is representative and tells students and graduates that graduate jobs are in reasonable supply and graduate salaries aren't actually falling.

Guess which one gets all the press coverage, and which one gets largely ignored?
