Researchers studying people’s musical preferences have found that psychopaths prefer listening to rap music and, contrary to the movie trope epitomised by Anthony Hopkins’s Hannibal Lecter in The Silence of the Lambs, they are no fonder of classical music than anyone else.
In a study of 200 people who listened to 260 songs, those with the highest psychopath scores were among the greatest fans of the Blackstreet number one hit No Diggity, with Eminem’s Lose Yourself rated highly too.
The New York University team behind the work stress that the results are preliminary and unpublished, but the scientists are sufficiently intrigued to launch a major study in which thousands of people across the psychopathy spectrum will be quizzed on their musical tastes.
Tests on a second group of volunteers suggest the songs could help to predict the disorder. Whatever their other personality traits might be, fans of The Knack’s My Sharona and Sia’s Titanium were among the least psychopathic, the study found.
The researchers have a serious goal in mind: if psychopaths have distinct and robust preferences for songs, their playlists could be used to identify them.
“The media portrays psychopaths as axe murderers and serial killers, but the reality is they are not obvious; they are not like The Joker in Batman. They might be working right next to you, and they blend in. They are like psychological dark matter,” said Pascal Wallisch, who led the research.
About 1% of the general population meets the description of a psychopath, but the figure is far higher in prisons, where about one in five has the disorder. One estimate, from Kent Kiehl, a psychologist at the University of New Mexico, suggests that psychopaths cost the US government alone $460bn (£340bn) a year.
“You don’t want to have these people in positions where they can cause a lot of harm,” said Wallisch. “We need a tool to identify them without their cooperation or consent.”
Scientists have already found gene variants that are more common in psychopaths, but they are hardly predictive of the disorder. They appear to alter people’s tendencies for empathy and aggression, but they do not determine people’s actions. Brain scans highlight distinct signs too, as the neuroscientist James Fallon discovered when he spotted the patterns of a psychopath in his own brain’s anatomy, but again, these do not set a person’s behaviour. Even if they did, the police cannot search for dangerous individuals by hauling people into brain scanners.
Wallisch recruited volunteers for a study on musical tastes, but realised that many of the participants had separately sat a battery of psychological tests, including one called the Levenson Self-Report Psychopathy Scale, which ranks people’s psychopathic traits. By combining the volunteers’ answers from the music study with their results from the psychopath test, Wallisch identified songs that seemed to be most popular among psychopaths, and others favoured by non-psychopaths.
While No Diggity and Lose Yourself were strikingly popular with psychopaths, other songs had greater predictive power. Wallisch declined to name them out of concern that doing so might compromise any future screening test.
The larger study will now investigate whether the link between musical tastes and psychopathy is real, and if it is, whether groups of songs can predict potential psychopaths. That could lead to some controversial applications, Wallisch said. If the team can identify a group of 30 songs, for example, that together prove good at predicting psychopaths, then playlists from online music providers could be used to identify them.
“The beauty of this idea is you can use it as a screening test without consent, cooperation or maybe even the knowledge of the people involved,” Wallisch said. “The ethics of this are very hairy, but so is having a psychopath as a boss, and so is having a psychopath in any position of power.” Fortunately for ethicists, the possibility is some way off yet. “This work is very preliminary,” Wallisch added. “This is not the end of an investigation, it is the very beginning.”
Kevin Dutton, a psychologist at Oxford and the author of The Wisdom of Psychopaths, has been gathering data on musical tastes and other preferences for a psychopath study with Channel 4. More than three million people have responded, and while online surveys have serious weaknesses, the results so far suggest psychopaths favour rap music over classical and jazz. They also seem more likely to read the Financial Times than other newspapers.
Regardless of its accuracy, Dutton suspects movie directors like the idea of classical music-loving psychopaths because of the “irresistibly alluring” juxtaposition. “The coming together of the dark, visceral, primeval psychopathic mind and the higher aesthetic of classical composition is inherently incongruous, and there is a whole body of literature on the creative potential of incongruity,” he said. “It is the hypnotically captivating and age-old appeal of the ‘beauty and the beast’, only under the same cortical roof.”
Emergency departments risk “grinding to a halt” this winter, say medical leaders. They warn that the number of patients facing long waits for treatment is likely to hit record levels.
Dr Taj Hassan, president of the Royal College of Emergency Medicine, said staff were dangerously overstretched, as NHS figures showed the number of people waiting more than 12 hours for treatment during the coldest months of the year has soared.
From January to March 2012, 15 patients waited for more than 12 hours – in the same period of 2017 the figure was more than 100 times greater, at 1,597.
Last winter was the worst on record for delays, with nearly 200,000 patients waiting for longer than the four-hour target. Hassan said emergency services will be under even greater strain this year, with patients forced to wait longer for basic treatments such as pain relief.
“Winter last year was relatively mild and without a major outbreak of flu. There are indications that the flu vaccine will not be as successful this year and as such we anticipate that conditions will be even more difficult,” said Hassan. Simon Stevens, chief executive of NHS England, has already put hospitals on high alert following major flu outbreaks in Australia and New Zealand, which it is feared may be repeated in the UK.
An extra 5,000 beds are needed “to get us through what will be a pretty awful winter”, said Hassan. “Over the last five years there has been a continued reduction in bed numbers yet an increase in patients needing to be admitted. As a result, bed occupancy is now at 92% – significantly higher than the safe level of 85% – which is having a knock-on effect on waiting times.”
A lack of funding, especially in social care, and staff shortages are preventing patients from being admitted swiftly and undermining safety, he said. “There is not enough money in the system to get social care packages, patients are delayed in hospital who should be at home, there are not enough acute hospital beds.”
The number of patients waiting for more than 12 hours also increased during the spring months, a time when pressures usually start to ease. From April to June 2017, 311 people waited more than 12 hours for treatment. For the same period in 2012, this was the case for only two patients.
Such figures are likely to underestimate the length of time spent in A&E because they only capture waiting times starting from when a decision to admit is made, not when the patient arrives.
“There can be little doubt that patients are suffering the consequences of this reduction,” said Hassan. “Along with more doctors, we desperately need more beds to stop the system from grinding to a halt.”
A Department of Health spokesperson said A&E departments had received an extra £100m to prepare for winter, in addition to £2bn of social care funding.
The spokesperson added: “This analysis completely overlooks the continued rise in demand on A&Es and the fact that since 2010 hardworking NHS staff are treating 1,800 more patients within four hours each day and are seeing 2.8 million more people each year.”
A snapshot view of NHS and other data on child and adolescent mental health reveals a stark difference along gender lines.
As reported earlier this week, the results of a study by University College London and the University of Liverpool show a discrepancy between the emotional problems perceived by parents and the feelings expressed by their children. Researchers asked parents to report signs of emotional problems in their children at various ages; they also presented the children at age 14 with a series of questions to detect symptoms of depression.
The study reveals that almost a quarter of teenage girls exhibit depressive symptoms. Data from NHS Digital, which examines the proportion of antidepressants prescribed to teenagers between 13 and 17 years old, shows that three-quarters of all antidepressants for this age group are prescribed to girls.
Eating disorders are one of the most common manifestations of mental health problems, and are in some cases closely related to depression. A year-by-year breakdown of hospital admissions for eating disorders indicates that, while eating disorders in both boys and girls are on the rise, more than 90% of teens admitted to the hospital for treatment are girls.
Records also show hospital admissions dating back to 2005 for individuals under 18 years old who had self-harmed. While the numbers for boys have varied less, following a general upward trend, the figure for girls has climbed sharply during the last decade, with the most significant jump occurring between 2012/13 and 2013/14.
Two of the most common methods of self-harm are poisoning and cutting. Self-poisoning victims are about five times as likely to be girls, and the number of girls hospitalised for cutting themselves has quadrupled over the course of a decade.
Although self-harm, depression, and other mental health problems are more commonly reported and identified in girls, suicide rates are far higher among boys. This data is consistent with research on differences found between men and women in methods used to commit suicide, the influence of alcohol, and other social or cultural factors.
In the UK the Samaritans can be contacted on 116 123. In the US, the National Suicide Prevention Lifeline is 1-800-273-8255. In Australia, the crisis support service Lifeline is 13 11 14. Other international suicide helplines can be found at www.befrienders.org.
Around 10,000 EU nationals have quit the NHS since the Brexit referendum, it has emerged.
NHS Digital, the agency that collects data on the health service, found that in the 12 months to June, 9,832 EU doctors, nurses and support staff had left, with more believed to have followed in the past three months.
This is an increase of 22% on the previous year and up 42% on two years previously. Among those from the EU who left the NHS between June 2016 and June 2017 were 3,885 nurses and 1,794 doctors.
This is the first time anecdotal evidence of Brexit fallout for the NHS has been quantified. The staff losses will intensify the recruitment problems of the NHS, which is struggling to retain nurses and doctors.
The British Medical Association said the findings mirrored its own research, which found that four in 10 EU doctors were considering leaving, with a further 25% unsure about what to do since the referendum.
“More than a year has passed since the referendum yet the government has failed to produce any detail on what the future holds for EU citizens and their families living in the UK,” said a spokeswoman for the BMA, which represents 170,000 doctors. “Theresa May needs to end the uncertainty and grant EEA [European Economic Area] doctors working in the NHS permanent residence, rather than using them as political pawns in negotiations.”
The Liberal Democrat leader, Vince Cable, called on May to take urgent action to stop a further exodus from the NHS. “Theresa May must make a bold offer to the EU to ringfence negotiations on citizens’ rights and come to a rapid agreement. We are losing thousands of high-quality nurses and doctors from the NHS, driven partly by this government’s heartless approach to the Brexit talks,” he said.
He and others, including Tory MP and former attorney general Dominic Grieve, want the issue of EU citizens ringfenced from the main Brexit talks to help staunch a potential exodus of valuable EU workers from Britain.
“It’s time for the government to take the issue of EU citizens in the UK and British nationals in Europe out of these negotiations,” said Cable.
This year it emerged that 40,000 nursing posts were vacant in the NHS in England as the service headed for the worst recruitment crisis in its history, according to new official data.
The figures came a week after a German consultant at a Bristol hospital told the Guardian how his department would be “closed down” if its Spanish nurses did not stay. Peter Klepsch, an anaesthetist, told how he and his neuroscientist wife came to the UK 12 years ago and had planned on staying but were now questioning their long-term future here.
“I know a lot of EU doctors and nurses who are saying, ‘I’m not going to stay very much longer,’” Klepsch said during a day of protest organised by the3million grassroots campaign for EU citizens.
The government has been accused of failing to deliver on its promise to guarantee EU citizens’ rights post Brexit and instead using them as a bargaining chip in negotiations.
In May the EU offered to guarantee the rights of all EU citizens affected by the UK’s decision to quit the bloc, including the 1.2 million British nationals already settled or retired on the continent. However, the offer was condemned as “pathetic” by the3million. This has led to an impasse in Brexit negotiations, with less than half the listed issues agreed after the third round of talks ended in August.
Tucked away behind York Minster – the grand cathedral adorned with medieval stained-glass windows that dominates the North Yorkshire city’s skyline – is a cobbled street that has become an informal labour exchange. Each day, just before lunch, couriers dressed in the distinctive mint green and black uniform of Deliveroo, the online food delivery company, arrive at the end of this street, park their bikes and scooters next to a bench, and talk among themselves. Clutching their smartphones, they wait for someone, somewhere in the city, to place an order with one of the nearby restaurants and cafes. When an order comes through, one of the couriers will pick it up and deliver it in exchange for a small fee. They will then return to the bench to wait.
Plenty of people in early 21st-century Britain can identify with the experience of working for a company like Deliveroo. Drivers for the taxi firm Uber, for example, know only too well what it’s like for work to arrive in fits and starts via an app. But even more people are employed on zero-hour contracts in a wide variety of jobs, from stacking shelves to waiting tables to caring for the elderly. According to the Office for National Statistics, around 900,000 workers rely on a job with a zero-hour contract. These people start every week not knowing how much work they will get or how much money they will earn.
Informal or casual employment of this kind helps explain why Britain’s unemployment rate has not sky-rocketed since the financial crash of 2008. By contrast, almost a century ago, during the struggles of the 1920s and the Great Depression of the 30s, unemployment regularly climbed above 10%; at the most difficult moments, it went above 20%, with the true level – including those who were out of work but not officially registered as unemployed – even higher. Unemployment was also a serious problem – and one that suffered from the same difficulties of measurement – during the 1980s, when it climbed steadily to more than 12% during the early Thatcher years and, despite a steady decline, ended the decade at almost 7%. Despite the past decade seeing one of the slowest economic recoveries in history, unemployment has not got out of hand for long periods. After peaking at 8.5% in 2011, the rate has recently dropped below 4.5%.
The Conservative-led governments of the past seven years argued that declining unemployment rates are a sign that austerity is working. In the wake of the financial crash, in which banks collapsed and ATMs were hours away from refusing to dispense cash, David Cameron, George Osborne and their colleagues argued that there were too many skivers, sleeping off a life on benefits, while everyone else – the strivers, as they were labelled – trudged to work to support them. Cutting benefits would solve all manner of problems: it would get the skivers back to work, bring public spending down, and be good for the general health of the economy.
Like the unemployment statistics, these claims are deceptive. Millions of people are “just about managing”, to use a phrase the prime minister, Theresa May, was once fond of, and many are faring much worse. In the 12 months before March 2017, the Trussell Trust, Britain’s largest food bank charity, gave out more than 1m three-day emergency food parcels to people in desperate need. At the same time, as the Guardian has reported this week, debt has ballooned in the UK, returning to pre-financial crash levels, with household debt at 150% of income in 2015. This debt has been fuelled by low-to-no wage growth, inflated house prices and, thanks to historically low interest rates, credit made available for items such as cars. But the main issue for the estimated 8.3 million people living with unmanageable debt is needing to borrow money to survive.
According to some commentators, much of this economic insecurity – a major contributor to the discontent that made Vote Leave’s slogan “take back control” so powerful in the EU referendum last year – is rooted in a profound set of changes taking place across western economies. Traditional ways of working and archaic vested interests are being challenged by new and powerful forces. The gig economy epitomised by the likes of Deliveroo and Uber, for example, is often talked about as “disruption”, with digital technology a new and irresistible means of transforming business practices and satisfying customers. Tremendous entrepreneurial individualism and flexibility are being unleashed; the world just needs to catch up.
The difficulty with these arguments is that we’ve been here before. The sight of workers standing in large groups waiting for work would have been familiar to the residents of British cities such as York more than a century ago. Those workers were painfully aware that irregular and low-paid employment offered few guarantees. While they might be able to obtain enough work each day, week or month, they could be stopped in their tracks at any moment by injury or illness. For all their willingness to work, those casual labourers, like their successors now, might not be able to make ends meet – and while not troubling the unemployment figures, were at constant risk of falling into debt and destitution. Those earlier generations’ answer to their problems, however, was the welfare state – the very thing that successive governments have blamed for the country’s current situation.
Many of the issues at the heart of the current malaise in British politics can be traced back to the late 19th and early 20th centuries, when what we now call the welfare state was slowly being assembled. From a legislative perspective, the welfare state was initially focused on a specific problem that had grown since the early 1800s: that many workers struggled to earn regular and reliable wages throughout the entire year. But these labour market problems were believed to be bound up with other issues: squalor, ignorance, want, idleness and disease – the “five giant evils” William Beveridge identified in his famous report, published in 1942. Schools, hospitals, council houses and benefits for those out of work were just some of the threads woven together to create the tapestry of the modern welfare state. The unravelling of that settlement has seen a resurgence of the original problem governments tried to tackle more than a century ago.
The country had grown wealthy during the industrial revolution, via the financial might of the City of London, the manufacturing power of the north of England, and an enthusiastic embrace of free trade. The poor, however, had not disappeared. The poor law, established in 1601, at the end of Elizabeth I’s reign, made Britain’s guarantee of help for the destitute unique among European nations. In the 1830s, an influential group of reformers, who later would be known as “modernisers”, changed the terms on which that help was offered. Assistance should amount to less than what the lowest-paid labourers could obtain with their wages, reformers insisted. Furthermore, help should only be available to people who were prepared to live in a workhouse – a dark, dank and miserable place where they were given an ill-fitting uniform and forced to carry out menial tasks in exchange for shelter and meagre rations of the most basic food.
The theory was deterrence: make the poor law frightening and only the most desperate – those truly in need – would trouble the authorities and public purse. Yet theory had a difficult relationship with reality. By the closing decades of the 19th century, hundreds of thousands of people – a total close to the population of Liverpool – were still using the poor law every year. The number was so large that many local authorities could not accommodate them in workhouses and had to continue offering cash handouts or food, as had been the case before the 1830s. Who were these people and why were they still asking for help?
Shipping magnate Charles Booth organised a survey of London in 1886, which collected information about what went on behind closed doors in the capital’s slums. Booth then divided London’s population into categories based on their economic means and – somewhat questionably to modern eyes – their habits and behaviour. The result was a shock for his middle-class readers: 30% of London’s population seemed unable or only just about able to meet the basic costs of living.
Booth’s research threw light on dark corners of Britain and implied that the poor were a much bigger group than even the government’s statistics on the number of poor law claimants suggested. Many people wondered if he was right. One was Seebohm Rowntree – a member of a York-based Quaker family that manufactured confectionery and prided themselves on being responsible employers. They tried to know all their workers’ names and introduced welfare schemes, including an eight-hour day. But this approach became more difficult as business boomed and their company grew during the late 19th century. When Booth’s report came out, they worried they were not close enough to their employees to know if Booth’s conclusions applied to them too.
Rowntree and his assistants went out on to the streets of York in 1897 to investigate. Armed with notebooks, they criss-crossed the city, frequently passing the place where Deliveroo couriers would congregate more than a century later. They visited more than 45,000 people in the following two years, asking how much they earned, what they paid in rent, what food they bought, and all manner of other questions about their lives. Rowntree made sure to compile information on wages from local employers and to consult the latest medical research on the number of calories men, women and children needed to consume every day. He used this information to draw a “poverty line” – a calculation of the goods and services an individual needed to survive in modern society and how much money they needed to acquire them – and figured out how many people fell below it.
To Rowntree’s surprise, Booth’s findings applied to York as well as London. But Rowntree did not agree with his static description of the poor. Booth’s classification included numerous sub-divisions and distinctions: those he considered criminal, morally weak and semi-savage were separated from the poor who had not displayed an obvious – and unacceptable – flaw, such as a weakness for drink. Rowntree, however, believed there was much movement between these categories. The poor seemed always to be with us, he explained, but the poor were not always the same people.
Rowntree identified what he called the “poverty cycle”. Many people earned enough money to support themselves, he argued. From time to time, though, their circumstances changed – they got married, had a child or a relative died. These quite ordinary events stretched resources, sometimes for just a few weeks, but often much longer. But when they were over, the pressure on household finances was lifted, meaning people rose above the poverty line. Nevertheless, there was always something around the corner, waiting to drag people back down again; most obviously old age, when all those years of being stretched to the limit and unable to save would take their final toll.
Social reformers and charity workers across the country observed similar patterns of interruptions to, and pressures on, people’s earnings throughout the late 19th and early 20th centuries. One of the most important vantage points was Toynbee Hall, a university settlement located between Whitechapel and Spitalfields in the East End of London, where a small group of Oxford graduates lived among the poor, doing voluntary work and social research, before taking up employment – often of a much more lucrative and prestigious kind – elsewhere.
Among Toynbee Hall’s residents between 1903 and 1905 was William Beveridge. Beveridge spent time in the East End working with the unemployed, observing their daily routines, assisting with schemes that aimed to get them back into work, and following caseworkers from charities. In the process, Beveridge came to a number of important conclusions. One was that unemployment was “at the root of most other social problems” because society “lays upon its members responsibilities which in the vast majority of cases can be met only from the reward of labour”. The other was that conventional wisdom about the causes of unemployment was wrong. For some commentators, unemployment was a question of character and motivation. An increasingly large part of mainstream opinion certainly accepted that a reality of modern industrial capitalism was periods when there would be no work available for some people – because trades were seasonal, or markets fluctuated. But Beveridge believed even this was a superficial understanding of the issue.
The biggest contribution to unemployment outside the downward slopes of the trade cycle, Beveridge argued, was the inefficiency of industry when it came to hiring workers. He asked readers of his book Unemployment: A Problem of Industry (1909) to imagine a scene he had encountered on many occasions: 10 wharves that each employed between 50 and 100 men per day, half of whom were regular staff and half of whom were reserves. While each wharf would experience similar high and low points in trade throughout the year, they were also likely to have their own individual fluctuations within those patterns. Anyone looking at the 10 wharves as a whole would not see these smaller deviations. The problem was that those smaller deviations were all that mattered to the reserve labourers walking from wharf to wharf asking for work each morning, because they meant the difference between them and their families eating, or going hungry.
If there was better communication and planning, Beveridge argued, almost all of those men would be able to find work each day. The problem was that business and industries were quite happy with the situation: they often had many more workers than vacancies, and did not need to pick up the costs of supporting those who couldn’t find work. Beveridge believed the state was the only institution with both the power to solve this problem and the interest in doing so. The political will to act on this conviction would have far-reaching implications for the millions of people who have found themselves out of work since. But we have slid backwards into a situation where precarious work paid by the hour is considered a sign of progress.
The Liberal party administrations that governed Britain before the first world war changed Britain for ever. They modernised the tax system, differentiating between earned and unearned incomes, and introduced graduated rates for the roughly 3% of the population who qualified to pay income tax. David Lloyd George, the Liberals’ charismatic chancellor of the exchequer, announced a “People’s Budget” in 1909 – one for “raising money to wage implacable warfare against poverty and squalidness”.
The principal aim of the budget was to tackle interruptions to earnings among the working-age population. Following the example of similar schemes in Germany, the national insurance scheme introduced in 1911 involved weekly contributions from three groups: workers, their employers and the state. All wage earners aged 16 to 70 and earning less than £160 a year who paid weekly contributions of four pence a week (three in the case of women) could claim sick pay for up to 26 weeks a year, and treatment from a government-approved doctor. But another aspect of the Liberals’ plans had not been tried anywhere else before: 2.25 million men in a number of trades and industries, such as construction and shipbuilding, where work could be brought to a halt by something as unpredictable as the weather, were to be enrolled in a scheme of compulsory unemployment insurance, which offered benefits of seven shillings a week for up to 15 weeks a year in exchange for contributions of two and a half pence a week.
These schemes had obvious limitations. Pensions were meagre; unemployment insurance was mainly for skilled men; health insurance excluded hospital care, and spouses and children. Almost everyone found something to be unhappy about. The British Medical Association complained about the prospect of doctors being forced to become government employees, while friendly societies, trade unions and private insurance companies thought the state was trying to force them out of business. Middle-class households resented being made to pay out to insure their domestic help. Moreover, Labour MPs complained about the contributory system. What about those who couldn’t pay in or who found themselves out of work for longer than 15 weeks? Why not follow the example of pensions and pay benefits to all out of general taxation?
The Liberal government recognised that national insurance on its own would not tackle interruptions to earnings; interventions into the economy would be required, too. Beveridge was drafted in by Winston Churchill, president of the board of trade, to help roll out a system of labour exchanges – another idea borrowed from Germany. A forerunner of the modern jobcentre, labour exchanges were an important part of the government’s plans for administering national insurance, with employers offered incentives to advertise vacancies in the exchanges and the unemployed asked to visit them to demonstrate they had looked for a job. For Beveridge, this had the potential to create an “organised fluidity of labour” that eliminated the kinds of problems he had observed in east London.
Labour exchanges certainly helped some people find jobs, but they were never the dynamic sites of free-flowing information and recruitment that Beveridge imagined. Pensions and national insurance proved much more successful and durable, though. After the first world war, a succession of governments extended and reformed the schemes in significant ways. The result, building on the centuries-old guarantee of help for the destitute, was an imperfect yet impressive system that offered assistance to many, though far from all, people in times of need.
During the Great Depression of the 1930s, however, when the Labour prime minister Ramsay MacDonald was expelled from his own party after he pushed through a 10% cut to unemployment benefits, there was widespread concern that these schemes were unsustainable. Interruptions to earnings looked like a minor problem when many people feared the complete collapse of the global economic system. What Britain seemed to have, Beveridge later suggested, was a series of “patches” – things that could be sewn on to the country’s tearing fabric, rather than solutions to its underlying problems. Perhaps capitalism – a system that treated unskilled workers without the fallback of insurance as dispensable – was the real problem. There had to be a way to manage the economy that would transform life for people in Britain and enable national insurance to offer genuine security to all.
The economist John Maynard Keynes was always clear about whose side he was on. “I can be influenced by what seems to me to be justice and good sense,” he explained, “but the class war will find me on the side of the educated bourgeoisie.” Although many of his contemporaries threw their lot in with Labour during the interwar years, believing they were the only realistic hope for progressive reform in an era of universal suffrage, Keynes stood firm. He was a Liberal and he intended to do everything he could to help the party – going so far as to help formulate its economic policy under Lloyd George’s leadership during the late 1920s and early 30s. Keynes was not alone. Beveridge might have kept his allegiance quiet in a bid to appear neutral, but in 1944 he won a byelection for the Liberals, and ran the party’s 1945 general election campaign.
Keynes cemented his status as the most important economist of the 20th century during the mid-1930s, when he published The General Theory of Employment, Interest and Money – the book that would serve as the set text for one side of the argument about how governments should respond to downturns and recessions. The world has revisited this argument regularly since the interwar years, including after the financial crash of 2008, when Nobel prize-winning economist Paul Krugman urged governments not to forget the lessons that Keynes had once taught them.
Thrashed out over five years of debate and discussion with his research students and colleagues at Cambridge, The General Theory is now widely known for a relatively small number of ideas and a simple message. Governments should resist the temptation to cut back during recessions, Keynes argued, because their root cause was a contraction in aggregate demand – the total amount of goods and services that are purchased in an economy – which collapsed when people and organisations, uncertain about the future, simultaneously chose to hold on to their money. Keynes explained how spending money had effects that rippled outwards in the economy, including the creation of employment, as the demand for goods and services increased. Governments should stop worrying about deficits and sound finance when times were bad – they could take care of them once everything was moving again.
Unlike Keynes, Beveridge was not invited to help run the economy during the second world war, and was disappointed not to be fully involved. By the summer of 1941, the government had tired of his sniping from the sidelines, and gave him a job they thought would keep him out of sight and mind for some time to come: a review of national insurance. In November the following year, he delivered the result: his ground-breaking report, Social Insurance and Allied Services.
The Beveridge report was not quite what people were expecting, or what people now think it is. The British people did not want a “Santa Claus state” that handed out gifts to everyone, he argued; they wanted a form of economic and social security that reflected a history of paying into the system. Benefits should cover a much wider range of risks, Beveridge explained, but they should be simple to understand: everyone – workers, employers and the state – should pay flat-rate contributions and get flat-rate benefits paid out in return.
On the face of it, there was no reason for the government to be worried. Beveridge’s benefits scheme involved significant extensions and new responsibilities, but it also required people to pay hefty weekly contributions. However, Beveridge had made a number of recommendations, which he modestly called “assumptions”. The first two were an allowance for each child born (after the first) and a National Health Service, free for all people, both of which would be paid for out of general taxation. The other was that the government commit to a new way of running the economy: one in which they made sure that unemployment never went above 8.5%.
Beveridge had called these recommendations assumptions because he believed a reformed system of national insurance could not work without them. Unemployment had to be kept below 8.5% so that people could build up a history of contributions and the system did not collapse under the weight of demand when they needed help. Indeed, the easiest system to administer was one in which authorities could safely assume there were jobs for the vast majority of people.
Initially, the government was not as enthusiastic about committing to his recommendations as the public, who bought an astonishing 100,000 copies of Social Insurance and Allied Services during the month after it was published. The prime minister, Winston Churchill, refused to comment on the report for three months, and offered a vague endorsement when he eventually did. By the end of 1944, however, when victory over Germany looked certain, a series of white papers committed the British state not only to Beveridge’s plan, but also to a number of other new policies, such as a new secondary school system with a leaving age of 15. The roadmap for social reconstruction had been drawn.
The welfare state that came into being during the late 1940s underpinned a whole way of life that politicians only started to pull apart from the early 1980s onwards. The intention during the third quarter of the 20th century was to bring capitalism under control, specifically its tendency to interrupt and put downwards pressure on people’s earnings, rather than dispense with the system entirely. The Labour party, which won a historic landslide in the election of July 1945, put its mark on the whole project, in particular by nationalising whole swaths of industry. Yet, after half a century of debate and legislation, each political party had left fingerprints on the final product.
These points matter for a number of reasons. One is that we often assume the welfare state was a collectivist venture. But even strident individualists found reasons to support it. Indeed, the era of social democracy helped create successive generations of individualists, including the working-class people who suddenly found themselves socially mobile during the 1950s, 60s and 70s. Looking back, and quite understandably, those generations can often give in to the temptation to imagine this progress was all down to their own hard work. Yet, as sociologists such as John Goldthorpe have shown, these generations rode the economic and social wave created by the policies adopted after 1945. Economic growth expanded the middle class by creating new management-level jobs into which working-class people could move, in both the public and private sectors, meaning there was “more room at the top”. Moreover, in an era of full employment, home ownership started to rise, not only because new houses were built, but also because it was perfectly reasonable for banks to assume that people would hold down a job for 25 continuous years and therefore pay back any money they borrowed.
Could that strategy be repeated today? The answer that has been given repeatedly for the past decade – and in some cases longer – is no. We have come to see the welfare state simply as a cost to be kept down rather than part of an economic and social strategy that aims to deliver security for all and opportunities to obtain more for those who want to. The idea that these goals are no longer obtainable is clearly false. A good start would be to reconnect with the liberal idea, now more than a century old, that everyone sees returns when they pool risks, whether it’s the individuals who can stop worrying about what is around the corner, governments that might otherwise cut their headline costs but succeed only in shifting them somewhere else, or the companies that benefit from healthy and educated workers operating in a safe environment. A successful economy requires all these actors to understand that they need to give, not just take, in order to build an environment in which they and those that follow them are able to succeed.
Are more radical measures required? In the long term, yes. The world has changed since the early 20th century: businesses and individuals behave differently and the “assumptions”, as Beveridge would have called them, that go with national insurance have evolved. The trend has been to pay for things by pushing the costs on to individuals, as has been done with university tuition fees. But there seems only so much mileage in this approach when debt is reaching dangerous levels, wages are stagnant and, as the economist Thomas Piketty has shown, income generated by wealth has increased rapidly for those lucky enough to have it.
One appropriate response would be to breathe new life into the radical strand of liberalism that differentiated between earned and unearned incomes back at the start of the 20th century. Piketty has argued for a global tax on wealth. But there are domestic policies that would go some way to achieving similar ends. We could consider applying capital gains tax to property – recouping some of the considerable profits that those generations who benefited from the welfare state have acquired from the houses they were able to buy, in part because of it.
Some commentators suggest what seem like even more radical ideas, such as universal basic income (UBI): a guaranteed regular payment for every citizen that would keep them above the poverty line, even if they chose not to work. UBI would deliver security, but faces numerous technical challenges, not least the significant differences in living costs across the country, which make a “universal” sum impossible to settle on, even before tackling the political problems of accusations that it would simply make everyone a benefit claimant. Yet versions of the idea have found support across the political spectrum, from neoliberals such as Milton Friedman to the leftwing economist and one-time Greek finance minister Yanis Varoufakis. For the left, a basic income would give people security and dignity. For the right, however, that basic security would be valuable because it would mean people would be free to take the kind of irregular work offered by the gig economy or zero-hours contracts. The lesson of these differences and convergences of opinion is that tackling economic insecurity need not be done at the expense of efficiency, competitiveness or innovation.
Main image: unemployed labourers waiting for work at a dockyard in March 1931. Photograph by Fox Photos/Getty
Bread for All by Chris Renwick is published by Allen Lane at £20.
The 250 people who will start to lose their sight today (Specsavers/RNIB figures) may never know that half of all sight loss is avoidable. National Eye Health Week (18-24 September) aims to address this by promoting regular eye tests for everyone, including those with good vision.
Recent research by the British Ophthalmological Surveillance Unit found that patients are suffering permanent and severe visual loss due to health service delays. Accordingly, until 29 September, an all-party parliamentary group on eye health and visual impairment has asked for evidence from interested parties including patients and families about how best to address the issue of overstretched NHS eye clinics.
The most common cause of blindness in the UK is age-related macular degeneration (AMD), which isolates and handicaps many thousands of sufferers. Yet I, and many others, have had very successful treatment for dry AMD, contrary to recent media reports discrediting the Hubble or EyeMax implants, currently being submitted to Nice for approval. If adopted, this would be the first NHS operation capable of helping all stages of AMD through a fast, painless procedure similar to a normal cataract operation. Elizabeth Lenton (retired GP) Plymouth
Australia’s new national cervical cancer screening program has received a boost, with a large clinical trial showing screening for the human papillomavirus is significantly better at detecting potential precancerous cells than the traditional pap smear.
“We found that the HPV test was substantially more effective at picking up high-grade abnormalities compared to the pap test,” said Prof Karen Canfell, director of research at Cancer Council New South Wales.
The results, published in the international journal PLOS Medicine, have been released less than three months before the transition to the new national cervical cancer screening program that eliminates the need for women to have a pap smear every two years.
From 1 December, women from the age of 25, instead of 18, will be required to have a five-yearly HPV test, replacing the two-yearly pap test.
Previous estimates have suggested the new screening program would lower cervical cancer incidence and mortality by at least 20% owing to the more accurate test. But until now it had not been tested among women with a high uptake of the HPV vaccine.
To examine its effectiveness, researchers at Cancer Council NSW and the Victorian Cytology Service compared the detection rates of high-grade cervical abnormalities under the HPV test and the pap test among 5,000 women aged 25 to 64. The women attended a routine screening at 47 participating clinics in Victoria.
They were randomly assigned to either have a five-yearly HPV screening test or a 2.5-yearly liquid-based cytology screening (pap test).
The study found that, compared with the pap test, HPV screening “significantly” increased detection of high-grade precancerous cervical lesions among those who had been vaccinated. The overall detection rate was 0.1% with the pap test versus 2.7% with HPV screening.
“These findings provide initial support for the implementation of primary HPV screening in vaccinated populations,” the authors wrote.
Canfell said increased detection means greater longer-term protection against the development of invasive cancer.
“This adds to existing evidence about how much more accurate and effective HPV screening is,” she said. “We now have a superior method for detecting not just the virus that causes cervical cancer, but also high-grade abnormalities.”
The second phase of the Compass trial is recruiting and hopes to have more than 120,000 participants.
TV viewers will have the opportunity to experience what their sight would be like with various eye conditions on Monday night when Channel 4 airs a unique advert break.
The broadcaster will show five commercials that will have different visual filters applied to them so the viewer can understand how their eyes would be affected by the most common eye conditions in the UK.
The conditions that will be demonstrated are macular degeneration, which affects the central part of a person’s vision; cataracts, which cause cloudy or misty sight; diabetes, which can damage blood vessels at the back of the eyes; hemianopia, where a person loses half of their vision; and glaucoma, which damages the optic nerve.
The adverts will be shown at 9.15pm during a break in The Undateables and will be replayed at 9.30pm with audio description for viewers with a visual impairment.
Channel 4 has put together the unprecedented advert break with the Royal National Institute of Blind People (RNIB) and Eye Health UK. The brands that will feature in the campaign are O2, Paco Rabanne, Amazon Echo, Freeview and Specsavers.
More than two million people in the UK – about one in 33 – are living with conditions that have caused sight loss. The Channel 4 campaign coincides with National Eye Health Week, which starts on Monday.
Sophie Castell, a director at RNIB, said: “This unique opportunity to work with Channel 4 and some really great advertisers will help show viewers different sight loss conditions and what living with sight loss can be like.
“The use of audio description across an entire ad break marks a cultural shift in advertising. We are really proud to be part of this exciting and rewarding initiative with Channel 4 and the advertisers.”
The campaign comes after Channel 4 aired a commercial break during the Paralympics in Rio de Janeiro last year that was fully signed by deaf artist and actor David Ellington.
Jonathan Allan, sales director at Channel 4, said: “Working with RNIB, we aim to illustrate the various perspectives of millions of people in the UK living with sight loss and provide full audio description to all our viewers.
“We hope this latest idea continues Channel 4’s legacy of delivering original, creative campaigns that focus on accessibility. From Superhumans Wanted and Maltesers to last year’s fully signed ad break and the launch of our diversity in advertising award we want to inspire advertisers to develop creative campaigns with their entire audience in mind.”
My father, Maurice Little, who has died aged 75, was a dedicated and skilled paediatrician, a committed volunteer for a number of charities and a loving father and husband.
He was born near Enniskillen, Northern Ireland, during the second world war, to Emily (nee Ross) and John Little, who were dairy farmers. After attending Portora Royal school in Enniskillen Maurice read medicine at Queen’s University, Belfast, graduating in 1965 and then leaving Northern Ireland to work in Liverpool at the Alder Hey hospital.
There he met Lorna Marchesi, a medical student. They married in 1970 and moved to Canada, where Maurice took up a fellowship in paediatric neurology at Kingston General hospital in Ontario. They loved their time in Canada, which included a secondment to an Inuit settlement on Hudson Bay; Maurice was always interested in meeting people from different cultures and was keen to learn how other people thought and lived.
My parents returned to Cardiff in 1973, where Maurice took up a senior registrar post at the University Hospital of Wales before moving to Kent in 1976. There he worked at All Saints hospital in Chatham as a consultant paediatrician. He mentored many people, to whom his advice was often simple, including the mantra: “Always listen to the mother.” Many families owed him a debt of gratitude for his tireless commitment to child health.
As well as numerous teaching and advisory roles and work with the Royal College of Paediatrics, Maurice worked for various charities, including Cancer and Leukaemia in Children, with whom he created a home care service for children with life-threatening illnesses. He also travelled to Romania and Palestine to teach and offer advice on clinical services.
After his retirement he was a member of the independent monitoring board of Rochester prison, in Kent, for more than a decade. He was a keen linguist and attended Irish and Spanish classes for many years.
He and Lorna loved to travel and had friends around the globe. In later years he was a devoted grandfather, never more in his element than when calming a crying baby in his expert hands. We called him “the baby whisperer”.
He is survived by Lorna, his three children, James, Katherine and me, and two grandsons, Billy and Fergus.
Recent reports of the terrifying weather events in the Caribbean and US have confronted us with people struggling to describe a hurricane wind (many would argue that getting mainstream US media to cover the causes rather than the effects of extreme weather would be a great advantage).
The Saffir-Simpson hurricane scale describes the effects on structures and people, but not what it would feel like. Hearing is the sense most frequently invoked, with winds sounding like a train or a low howl.
Clearly wind detection is not one of the five classical senses. It is only recently that the anatomy of wind detection in the fruit fly has been discovered. Wind generates vibrations which are generally processed like touch, but it seems that for insects at least, it’s the hearing circuits that are best placed to decode these signals.
Of course, the most basic response to these events is fear and awe. Our brain generates the fear and that triggers the gut, which feeds back to the brain. The sensation is really part of an indirect loop. If the hurricane struck your body directly that would be a very different story.
Dr Daniel Glaser is director of Science Gallery at King’s College London