
Passenger Focus, the independent organisation set up by the UK Government to represent the interests of British public transport passengers, published its latest National Passenger Survey (NPS) of the British rail-riding public at the end of January. Its results, based on responses from more than 31,000 passengers across the country, were seen as a boon for a UK rail sector that has been beset by franchise uncertainty and technical issues in the last year.

The Autumn 2012 wave of the NPS, which Passenger Focus carries out twice a year, revealed record overall passenger satisfaction levels of 85%, with no train operating company (TOC) scoring less than 80%. While the survey highlighted individual areas of frustration for passengers – only 47% of passengers were satisfied with value for money, for example – the underlying message for the industry was a positive one.

"Passengers are saying the quality of rail services is improving," said Passenger Focus chief executive Anthony Smith after the report was published. "The combination of increased income from fares, government investment and a clearer focus on performance and dealing with disruption is beginning to pay off."

The Which? passenger survey: a dissenting voice

"The perceived contradiction between these two surveys is ultimately a red herring."

However, less than a month later the NPS’s mostly sunny conclusions were met with a dissenting voice. The second annual train satisfaction survey by consumer watchdog Which?, published in mid-February, made grimmer reading for the industry. The survey of around 7,500 regular rail users found that more than half of the UK’s TOCs had a customer satisfaction rating of 50% or lower, and, in stark contrast to Smith’s upbeat pronouncement, a mere 22% of those surveyed felt that services were improving, despite regular fare increases.

A statement from Which? executive director Richard Lloyd summarised the main passenger concerns raised by the survey: "Passengers tell us they are fed up with trains that are delayed, overcrowded and dirty. This is especially disappointing, as many commuters can’t shop around or change the company they travel with. Train companies need to play fair with their customers, especially when they are being asked to pay more for their journeys."


At first glance, it seems an impossible scenario – two passenger satisfaction surveys, published within weeks of each other, recording wildly different satisfaction levels for the same group of passengers. If we assume the surveys can’t both be right, an obvious question follows: is the British public satisfied with its rail services or not?

It’s all in the methodology

On closer inspection, the assumption that both surveys cannot be correct looks premature. The surveys vary drastically in their scope, objectives and, most importantly, their methodology – a point that may have confused casual observers, as the methodology isn’t made entirely clear in the press releases accompanying either report. These differences are the prime factor behind the apparent contradiction in the surveys’ outcomes.

"The surveys vary drastically in their scope, objectives and, most importantly, their methodology."

By far the biggest difference between the surveys is the scope of questions asked of passengers. The Which? survey asked passengers to rate their train journeys over the last 12 months, based on two key criteria – overall satisfaction with the brand and the likelihood that the respondent would recommend the brand to a friend. The responses to these questions fed into an overall customer score.

The NPS, meanwhile, limited the scope of its questions to the journey that respondents had taken on the day they were surveyed. Passenger Focus’s head of research Ian Wright explains the thought process behind this method: "The objective of our survey was to find out how passengers feel about service delivery and performance, so we sampled a representative set of journeys based on known characteristics of the ‘universe’ of all journeys. If you want to know about general perceptions of the industry and the brands within it, you can ask a representative sample of passengers."

Is one method superior to the other?

With this key difference, it’s easier to understand how the surveys could come up with such contradictory results. The Which? survey’s primary objective was to gauge passengers’ overall perceptions of the TOCs they had been using during the last year, while the NPS is intended to provide much more specific feedback on individual routes, times and other factors to help train companies improve customer experiences in particular problem areas. "Our survey and methodology is different to the Passenger Focus survey and therefore difficult to compare," says a Which? spokesperson.

"Less than a month later, the NPS’s mostly sunny conclusions were met with a dissenting voice."

It would be easy to accuse Passenger Focus of tailoring the reach of its questions to paint a rosier picture of the UK rail landscape, especially given its origin as a government-created organisation (Wright’s brusque response to that, for the record: "Parliament set up Passenger Focus as an independent body – successive governments have respected that"). After all, under the NPS the respondent would rate a single good journey positively, even if the last ten journeys were a disaster.

But it’s hard to deny the advantages of the NPS’s methodology in supporting its goal of improving customer service through focused feedback. As well as allowing passengers to judge a journey they can remember in detail, asking each passenger about that day’s journey helps counteract the well-documented tendency for bad experiences to weigh more heavily on a customer’s perception of a brand than positive ones.

"It is not uncommon for questions about general brand satisfaction to be lower than a measure based on ‘your trip today’," says Wright. "We know that individuals place higher weight on bad experiences than on good, and so a generic satisfaction score takes undue account of any recent bad experiences, which are often more memorable than the numerous journeys where service and performance were satisfactory.

"Therefore we believe the Which? measure is less about satisfaction with performance – like the NPS – and more about perceptions of the brand, which can be influenced by other things including media coverage and advertising."

The detailed feedback sought by Passenger Focus is supported by the huge number of passengers its survey covers. According to Wright, tapping the individual experiences of more than 30,000 passengers gives its results a significantly smaller margin of error than the Which? findings.

"The NPS sample size gives us precise scores at a TOC and service group level, as well as measuring change over time with more precision," he says. "For example, the London Overground satisfaction score of 93% from the Autumn 2012 wave of NPS is accurate to within approximately 1.5 percent. The corresponding figure of 65% from the Which? survey is accurate to within approximately 11% – so the ‘true’ score lies somewhere between 54 and 76%. Being able to provide accurate satisfaction scores at a sub-TOC level means we are able to get closer to the individual passenger experience."

Finding common ground

The perceived contradiction between these two surveys is ultimately a red herring. They were conducted with different methods, and there is nothing inherently contradictory about passengers being happy with individual journeys while remaining dissatisfied with a TOC’s performance over a longer stretch of time. And despite their differences, the two surveys do find common ground – the ranking of TOCs is largely similar between the reports, and both identify problems with value for money and commuter services.

The Which? survey’s value lies in recording the long-term impressions of daily commuters and other regular rail users, many of whom clearly feel overcharged and underserved by their operators, while the NPS provides an important service by identifying problem areas with an accuracy and specificity that is arguably unmatched anywhere in the world.

But casual media observers and the general rail-going public are likely to take away little more than the top-line statistics (a BBC news report pointed out the contradiction between the surveys but didn’t attempt to explain it), and confusion is the unsurprising result of taking complex information at face value. Perhaps the responsibility lies with organisations such as Passenger Focus and Which? to ensure that the information they present is placed in its proper context, both in the reports themselves and in the press releases that bring them to the public’s attention.

