Hospital bosses forced to chant ‘we can do this’ over A&E targets

Hospital bosses were forced to chant “we can do this” by a senior NHS official in an effort to improve their accident and emergency performance in advance of what doctors have warned will be a tough winter for the NHS.

Hospital trust chief executives say they were left feeling “bullied, patronised and humiliated” by the incident last week at a meeting attended by Jeremy Hunt, the health secretary, and Simon Stevens, the head of the NHS in England.

The leaders of about 60 trusts which NHS national bodies deemed to have the worst record on meeting the politically important four-hour A&E treatment target were called into a meeting held in London on Monday 18 September.

Chief executives present say that they were divided into four regional groups, covering the south and north of England, London, and the Midlands and east of the country, each of which held a separate session with a senior NHS England official.

Paul Watson, NHS England’s regional director for the Midlands and east of England, then encouraged those in the group he was leading to chant “we can do it” as part of a renewed effort to improve their A&E performance. Hunt and Stevens are not thought to have been at that session; nor was Jim Mackey, chief executive of health service regulator NHS Improvement, who jointly convened the meeting with Hunt and Stevens.

One chief executive said: “It was awful – the worst meeting I’ve been at in my entire career. Watson said: ‘Do you want the 40-slide version of our message or the four-word version?’ Everyone wanted the four-word version, obviously.

“He then said: ‘I want you to all chant “we…can…do…this”.’ It was awful, patronising and unhelpful, and came straight after the whole group had just been shouted at over A&E target performance and told that we were all failing and putting patient safety at risk.”

According to the Health Service Journal, which revealed what had happened at the meeting, Watson told trust bosses that they were initially chanting too quietly and that they should chant the slogan again but louder, and “take the roof off” with the noise.

Watson’s use of the tactic has prompted complaints from within the NHS that the chanting was “Bob the Builder for NHS leaders”, after the children’s TV character Bob the Builder and his “Can we fix it? Yes we can” catchphrase. Another HSJ reader posted a comment on its website saying: “More akin to North Korea than the NHS”.

The anger and ridicule directed at Watson prompted him to apologise for, and explain, his behaviour in messages posted on the HSJ website after it published the story.

“If anyone found my session on Monday inappropriate in any way then I can only apologise – it was meant as light relief rather than brainwashing,” said Watson.

“As I said at Monday’s event, this can be done. If that seems cheesy or patronising then so be it but it does have the merit of being true – Paul”, he added.

He also repeated his claim that inadequate A&E performance endangered patients’ safety.

“It’s good to let off steam but let’s remember what’s at stake here: 1 Urgent care is the most basic service the NHS provides; 2 A badly run, crowded ED [emergency department] is a miserable experience for our patients; 3 These patients are often frail, elderly and frightened as well as very ill; 4 A crowded ED can be dangerous.”

If other trusts could provide excellent A&E services despite the rising demand for care, why could the 60 represented at the meeting not do that, he asked. He also angered trust bosses by saying that “the biggest single determinant of whether a struggling service is turned round is the confidence, optimism and determination of local leadership to do this and follow it through”.

The Guardian has approached NHS England and the Department of Health for comment.

Why we are hard wired to watch pornography | Daniel Glaser

The launch of David Simon’s new series The Deuce (starting on 26 September) has thrust pornography back into the spotlight. One of the most famous neuroscientific discoveries of recent decades probably plays a role.

This is the discovery of ‘mirror neurons’ in the cortex of the macaque monkey, so named because they fire both when the monkey performs an action and when it sees the same action performed – ‘mirroring’ behaviour it witnesses.

Cells in the human brain have been shown to exhibit similar behaviour. Dancers use their knowledge of movement to help them see it: to understand and enjoy it more. The implications for pornography are clear. Without such a system in the brain, explaining why people find watching sex arousing is difficult.

There have been a few studies demonstrating a correlation between mirror-system activation and erections in men, but it’s largely escaped systematic study. It’s hard to believe that this use has evolutionary significance, although some studies have shown that male monkeys will give up a certain amount of fruit juice to look at pictures of female monkeys’ bottoms.

Dr Daniel Glaser is director of Science Gallery at King’s College London

Mental health data shows stark difference between girls and boys

A snapshot view of NHS and other data on child and adolescent mental health reveals a stark difference along gender lines.

As reported earlier this week, the results of a study by University College London and the University of Liverpool show a discrepancy between the emotional problems perceived by parents and the feelings expressed by their children. Researchers asked parents to report signs of emotional problems in their children at various ages; they also presented the children at age 14 with a series of questions to detect symptoms of depression.

Graph showing that there is a discrepancy between self-expressed emotional problems in teens and problems reported by their parents


The study reveals that almost a quarter of teenage girls exhibit depressive symptoms. Data from NHS Digital on antidepressants prescribed to teenagers aged between 13 and 17 shows that three-quarters of all antidepressants for this age group are prescribed to girls.

More than two-thirds of antidepressants prescribed to teenagers are for girls


Eating disorders are among the most common manifestations of mental health problems, and are in some cases closely related to depression. A year-by-year breakdown of hospital admissions for eating disorders indicates that, while eating disorders among both boys and girls are on the rise, more than 90% of teenagers admitted to hospital for treatment are girls.

Graph showing the difference between girls and boys admitted to hospital for eating disorders

Records also show hospital admissions dating back to 2005 for under-18s who self-harmed. While the figure for boys has varied less, following a general upward trend, the figure for girls has climbed sharply during the last decade, with the most significant jump occurring between 2012/13 and 2013/14.

Hospital admissions for self-harm are up by two-thirds among girls


Two of the most common methods of self-harm are poisoning and cutting. Girls admitted for self-poisoning outnumber boys by about five to one, and the number of girls hospitalised for cutting themselves has quadrupled over the course of a decade.

Most self-harm admissions involve cases of self-poisoning, which has risen drastically among girls
Self-harm hospitalisations involving girls cutting themselves have quadrupled since 2005


Although self-harm, depression, and other mental health problems are more commonly reported and identified in girls, suicide rates are far higher among boys. This data is consistent with research on differences found between men and women in methods used to commit suicide, the influence of alcohol, and other social or cultural factors.

Teenage boys are more than twice as likely to kill themselves as girls
  • In the UK the Samaritans can be contacted on 116 123. In the US, the National Suicide Prevention Lifeline is 1-800-273-8255. In Australia, the crisis support service Lifeline is 13 11 14. Other international suicide helplines can be found at www.befrienders.org.

I live a healthier life now I’m free of the trappings of modernity | Mark Boyle

When people learn of my decision to reject modern complex technology in favour of older, slower, forgotten ways, their first line of inquiry usually involves healthcare. Considering its importance to our lives, this is hardly surprising. Yet because of its emotive nature – which of us, after all, doesn’t have friends or family needing glasses, hearing aids, stents or prescription drugs? – it seems difficult to have a calm, objective discussion on the subject.

The more concerned and curious inquirers often ask me what I would do if I got seriously ill. While the long answer is complicated and nuanced, honestly, I don’t know. It’s easy to live by your values when times are good, much harder when you’re having a stroke or dying of cancer.

One thing I can say with more confidence is this: if we continue pursuing this political ideology of mass industrialism – which has given us ambulances, dialysis machines, wheelchairs and antidepressants – not only will we continue to harm our physical, emotional and mental health (leading to even more people needing such things), but we’ll also wipe out much of life on Earth.

Industrial civilisation, itself only 200 years old, is already causing the sixth mass extinction of species of the last half billion years. What’s that got to do with an ambulance? Well, both nothing and everything. The ambulance itself undoubtedly saves lives (including my dad’s). Yet deconstruct a single ambulance – with its plastics, oils, fluids, copper, acids, glass, rubber, PVC, minerals and steel – and I’ll show you how to lay waste to the very thing all our lives depend upon: the planet.

Big picture aside, most of the conditions that afflict us today – cancer, obesity, mental illness, diabetes, stress, auto-immune disorders, heart disease, along with those slow killers: meaninglessness, clock-watching and loneliness – are industrial ailments. We create stressful, toxic, unhealthy lifestyles fuelled by sugar, caffeine, tobacco, antidepressants, adrenaline, discontent, energy drinks and fast food, and then defend the political ideology that got us hooked on these things in the first place. Our sedentary jobs further deplete our physical, emotional and mental wellbeing, but instead of honestly addressing the root cause of the illness we exert ever more effort, energy, genius and money trying to treat the symptoms and contain the epidemics.

We’ve developed Stockholm syndrome, sympathising with the very system that has economically held us hostage since the 18th century. Industrialism, along with its partner in crime, capitalism, has even persuaded us that, in order to save ourselves and loved ones from the horrors of disease we should spray every surface with chemicals, keep children’s hands out of the dirt and muck, and try to sterilise our entire world. With our immune systems compromised as a result, multi-billion-dollar pharmaceutical companies then sell us products to fend off what our bodies should be able to fight off naturally.

In their cleverness they have even persuaded us to pop painkillers for things that hardier generations would balk at. My own approach to healthcare won’t satisfy the critics, the advocates of this strange thing called progress that seems to have us all more stressed and less content. And that’s OK; I’m not trying to tell people what to do, and I’ve got no product to sell. I share it only because my editor tells me it’s the most common online inquiry.

In doing so I’m very aware that I’ve been blessed to be born without any serious long-term health issues, and that at 38 I’m relatively young. That said, I’m not convinced that it’s necessary to fall into such poor physical shape, as civilised peoples tend to do. My dad is almost 73 and he can still cycle 150km before dinner, simply because he has never stopped looking after his health.

The philosophy underlying my approach is that of any herbalist: keep the vitality in your body strong, and be mindful to do it every day. When it goes out of ease and into disease, use the appropriate plants – the original source of many industrial medicines – to bring your body and mind back into balance, and to restore optimal functioning. Your body is always aiming for balance and health, and listening to it is one of the best things you can do. Illness is feedback – the sooner you heed it and restore your vitality, the less likely it is you’ll develop more serious problems.

I find it impossible to describe my approach to health without describing my approach to life. I wouldn’t dream of suggesting that this is a prescriptive solution for anyone else; but with the exception of a voluntary vasectomy, I haven’t seen a doctor or nurse for 20 years.

I pick my own fruit and vegetables from the garden and hedgerows, and eat them as fresh, raw and unwashed as is optimal. I cycle 120km each week to lakes and rivers, where I then spend three evenings of that week relaxing and catching the following day’s dinner. I work outdoors, getting sweaty and dirty doing things I enjoy. I made the tough decision to live in the natural world so that I could breathe clean air, drink pure water and create life that allows others the same. I wash with water, and water only. I use no chemicals inside or outside the house. I wear as few clothes as I need, I use nothing electrical – no fridge, no screens, no phone. I avoid sugar, caffeine and stress like the plague.

Sleep comes and goes with the light – I find six hours of peaceful rest sufficient. If and when I do feel ill or out of balance, my girlfriend Kirsty (who illustrates these articles and is teaching herself herbalism) recommends a plant from our herb patch and I slowly feel vital again. She’s currently drying yarrow, horsetail, silverweed, self-heal, calendula and chamomile for the winter months.

I’ve suffered from hay fever – something becoming more common as CO2 levels in the atmosphere increase – since I was a child. These days I eat a handful of plantain leaves – a natural antihistamine – three or four times a day, and that sorts it. Plantain comes out just before hay fever season and goes to seed shortly afterwards, and is as common in the cracks of city pavements and lawns as it is in the countryside.

I appreciate that this may sound unrealistic to many. When I was working 60 hours a week in a low-paid job in the City, 10 years ago, it did to me too. I only managed to do it by stripping away modernity’s bullshit, learning to live with the land, and reducing my bills to zero. Simplicity in these times is hard won, but I’ve found that it’s worth it.

I can only speak for myself, and I support everyone’s decision to care for their own health as they see fit. Ultimately, we’re all going to die and I wish to go out like the American writer and conservationist Edward Abbey: by taking off to the wilderness, where wildlife can feed on my dead body just as I have done on theirs. It seems only fair.

Two things, in this respect, I find important. One is that, as Henry David Thoreau once remarked, I do not reach death only to “discover that I had not lived”. The second is that I don’t cling to my own fading light so desperately that I extinguish it for all else. Like all good guests, it’s wise not to overstay your welcome.

This article was written by hand and posted to an editor at the Guardian, who typed it up to go online. Get in touch with Mark Boyle, the Guardian’s Living Without Technology columnist, here or in the comments below, a selection of which will be posted to him

Why we need the welfare state more than ever

Tucked away behind York Minster – the grand cathedral adorned with medieval stained-glass windows that dominates the North Yorkshire city’s skyline – is a cobbled street that has become an informal labour exchange. Each day, just before lunch, couriers dressed in the distinctive mint green and black uniform of Deliveroo, the online food delivery company, arrive at the end of this street, park their bikes and scooters next to a bench, and talk among themselves. Clutching their smartphones, they wait for someone, somewhere in the city, to place an order with one of the nearby restaurants and cafes. When an order comes through, one of the couriers will pick it up and deliver it in exchange for a small fee. They will then return to the bench to wait.

Plenty of people in early 21st-century Britain can identify with the experience of working for a company like Deliveroo. Drivers for the taxi firm Uber, for example, know only too well what it’s like for work to arrive in fits and starts via an app. But even more people are employed on zero-hour contracts in a wide variety of jobs, from stacking shelves to waiting tables to caring for the elderly. According to the Office for National Statistics, around 900,000 workers rely on a job with a zero-hour contract. These people start every week not knowing how much work they will get or how much money they will earn.

Informal or casual employment of this kind helps explain why Britain’s unemployment rate has not sky-rocketed since the financial crash of 2008. By contrast, almost a century ago, during the struggles of the 1920s and the Great Depression of the 30s, unemployment regularly climbed above 10%; at the most difficult moments, it went above 20%, with the true level – including those who were out of work but not officially registered as unemployed – even higher. Unemployment was also a serious problem – and one that suffered from the same difficulties of measurement – during the 1980s, when it climbed steadily to more than 12% during the early Thatcher years and, despite a steady decline, ended the decade at almost 7%. Despite the past decade seeing one of the slowest economic recoveries in history, unemployment has not got out of hand for long periods. After peaking at 8.5% in 2011, the rate has recently dropped below 4.5%.

The Conservative-led governments of the past seven years argued that declining unemployment rates are a sign that austerity is working. In the wake of the financial crash, in which banks collapsed and ATMs were hours away from refusing to dispense cash, David Cameron, George Osborne and their colleagues argued that there were too many skivers, sleeping off a life on benefits, while everyone else – the strivers, as they were labelled – trudged to work to support them. Cutting benefits would solve all manner of problems: it would get the skivers back to work, bring public spending down, and be good for the general health of the economy.

Like the unemployment statistics, these claims are deceptive. Millions of people are “just about managing”, to use a phrase the prime minister, Theresa May, was once fond of, and many are faring much worse. In the 12 months before March 2017, the Trussell Trust, Britain’s largest food bank charity, gave out more than 1m three-day emergency food parcels to people in desperate need. At the same time, as the Guardian has reported this week, debt has ballooned in the UK, returning to pre-financial crash levels, with household debt at 150% of income in 2015. This debt has been fuelled by low-to-no wage growth, inflated house prices and, thanks to historically low interest rates, credit made available for items such as cars. But the main issue for the estimated 8.3 million people living with unmanageable debt is that they need to borrow money simply to survive.

Deliveroo cycle couriers waiting for orders in London. Photograph: Bloomberg/Getty

According to some commentators, much of this economic insecurity – a major contributor to the discontent that made Vote Leave’s slogan “take back control” so powerful in the EU referendum last year – is rooted in a profound set of changes taking place across western economies. Traditional ways of working and archaic vested interests are being challenged by new and powerful forces. The gig economy epitomised by the likes of Deliveroo and Uber, for example, is often talked about as “disruption”, with digital technology a new and irresistible means of transforming business practices and satisfying their customers. Tremendous entrepreneurial individualism and flexibility are being unleashed; the world just needs to catch up.

The difficulty with these arguments is that we’ve been here before. The sight of workers standing in large groups waiting for work would have been familiar to the residents of British cities such as York more than a century ago. Those workers were painfully aware that irregular and low-paid employment offered few guarantees. While they might be able to obtain enough work each day, week or month, they could be stopped in their tracks at any moment by injury or illness. For all their willingness to work, those casual labourers, like their successors now, might not be able to make ends meet – and while not troubling the unemployment figures, were at constant risk of falling into debt and destitution. Those earlier generations’ answer to their problems, however, was the welfare state – the very thing that successive governments have blamed for the country’s current situation.


Many of the issues at the heart of the current malaise in British politics can be traced back to the late 19th and early 20th centuries, when what we now call the welfare state was slowly being assembled. From a legislative perspective, the welfare state was initially focused on a specific problem that had grown since the early 1800s: that many workers struggled to earn regular and reliable wages throughout the entire year. But these labour market problems were believed to be bound up with other issues: squalor, ignorance, want, idleness and disease – the “five giant evils” William Beveridge identified in his famous report, published in 1942. Schools, hospitals, council houses and benefits for those out of work were just some of the threads woven together to create the tapestry of the modern welfare state. The unravelling of that settlement has seen a resurgence of the original problem governments tried to tackle more than a century ago.

The country had grown wealthy during the industrial revolution, via the financial might of the City of London, the manufacturing power of the north of England, and an enthusiastic embrace of free trade. The poor, however, had not disappeared. The poor law, established in 1601, at the end of Elizabeth I’s reign, made Britain’s guarantee of help for the destitute unique among European nations. In the 1830s, an influential group of reformers, who later would be known as “modernisers”, changed the terms on which that help was offered. Assistance should amount to less than what the lowest-paid labourers could obtain with their wages, reformers insisted. Furthermore, help should only be available to people who were prepared to live in a workhouse – a dark, dank and miserable place where they were given an ill-fitting uniform and forced to carry out menial tasks in exchange for shelter and meagre rations of the most basic food.

The theory was deterrence: make the poor law frightening and only the most desperate – those truly in need – would trouble the authorities and public purse. Yet theory had a difficult relationship with reality. By the closing decades of the 19th century, hundreds of thousands of people – a total close to the population of Liverpool – were still using the poor law every year. The number was so large that many local authorities could not accommodate them in workhouses and had to continue offering cash handouts or food, as had been the case before the 1830s. Who were these people and why were they still asking for help?

Kensington High Street in London circa 1895. Photograph: Heritage Images/Getty

Shipping magnate Charles Booth organised a survey of London in 1886, which collected information about what went on behind closed doors in the capital’s slums. Booth then divided London’s population into categories based on their economic means and – somewhat questionably to modern eyes – their habits and behaviour. The result was a shock for his middle-class readers: 30% of London’s population seemed unable or only just about able to meet the basic costs of living.

Booth’s research threw light on dark corners of Britain and implied that the poor were a much bigger group than even the government’s statistics on the number of poor law claimants suggested. Many people wondered if he was right. One was Seebohm Rowntree – a member of a York-based Quaker family that manufactured confectionery and prided themselves on being responsible employers. They tried to know all their workers’ names and introduced welfare schemes, including an eight-hour day. But this approach became more difficult as business boomed and their company grew during the late 19th century. When Booth’s report came out, they worried they were not close enough to their employees to know if Booth’s conclusions applied to them too.

Rowntree and his assistants went out on to the streets of York in 1897 to investigate. Armed with notebooks, they criss-crossed the city, frequently passing the place where Deliveroo couriers would congregate more than a century later. They visited more than 45,000 people in the following two years, asking how much they earned, what they paid in rent, what food they bought, and all manner of other questions about their lives. Rowntree made sure to compile information on wages from local employers and to consult the latest medical research on the number of calories men, women and children needed to consume every day. He used this information to draw a “poverty line” – a calculation of the goods and services an individual needed to survive in modern society and how much money they needed to acquire them – and figured out how many people fell below it.

To Rowntree’s surprise, Booth’s findings applied to York as well as London. But Rowntree did not agree with his static description of the poor. Booth’s classification included numerous sub-divisions and distinctions: those he considered criminal, morally weak and semi-savage were separated from the poor who had not displayed an obvious – and unacceptable – flaw, such as a weakness for drink. Rowntree, however, believed there was much movement between these categories. The poor seemed always to be with us, he explained, but the poor were not always the same people.


Rowntree identified what he called the “poverty cycle”. Many people earned enough money to support themselves, he argued. From time to time, though, their circumstances changed – they got married, had a child or a relative died. These quite ordinary events stretched resources, sometimes for just a few weeks, but often much longer. But when they were over, the pressure on household finances was lifted, meaning people rose above the poverty line. Nevertheless, there was always something around the corner, waiting to drag people back down again; most obviously old age, when all those years of being stretched to the limit and unable to save would take their final toll.

Social reformers and charity workers across the country observed similar patterns of interruptions to, and pressures on, people’s earnings throughout the late 19th and early 20th centuries. One of the most important vantage points was Toynbee Hall, a university settlement located between Whitechapel and Spitalfields in the East End of London, where a small group of Oxford graduates lived among the poor, doing voluntary work and social research, before taking up employment – often of a much more lucrative and prestigious kind – elsewhere.

Among Toynbee Hall’s residents between 1903 and 1905 was William Beveridge. Beveridge spent time in the East End working with the unemployed, observing their daily routines, assisting with schemes that aimed to get them back into work, and following caseworkers from charities. In the process, Beveridge had come to a number of important conclusions. One was that unemployment was “at the root of most other social problems” because society “lays upon its members responsibilities which in the vast majority of cases can be met only from the reward of labour”. The other was that conventional wisdom about the causes of unemployment was wrong. For some commentators, unemployment was a question of character and motivation. An increasingly large part of mainstream opinion certainly accepted that a reality of modern industrial capitalism was periods when there would be no work available for some people – because trades were seasonal, or markets fluctuated. But Beveridge believed even this was a superficial understanding of the issue.

William Beveridge circa 1944. Photograph: Hans Wild/Time Life/Getty

The biggest contribution to unemployment outside the downward slopes of the trade cycle, Beveridge argued, was the inefficiency of industry when it came to hiring workers. He asked readers of his book Unemployment: A Problem of Industry (1909) to imagine a scene he had encountered on many occasions: 10 wharves that each employed between 50 and 100 men per day, half of whom were regular staff and half of whom were reserves. While each wharf would experience similar high and low points in trade throughout the year, they were also likely to have their own individual fluctuations within those patterns. Anyone looking at the 10 wharves as a whole would not see these smaller deviations. The problem was that those smaller deviations were all that mattered to the reserve labourers walking from wharf to wharf asking for work each morning, because they meant the difference between them and their families eating, or going hungry.

If there was better communication and planning, Beveridge argued, almost all of those men would be able to find work each day. The problem was that business and industries were quite happy with the situation: they often had many more workers than vacancies, and did not need to pick up the costs of supporting those who couldn’t find work. Beveridge believed the state was the only institution with both the power to solve this problem and the interest in doing so. The political will to act on this conviction would have far-reaching implications for the millions of people who have found themselves out of work since. But we have slid backwards into a situation where precarious work paid by the hour is considered a sign of progress.
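Beveridge’s wharf illustration is, at heart, a point about aggregation and pooling, and it can be made concrete with a small simulation. The Python sketch below is not from Beveridge or from this article; the figures (ten wharves, 50 regular men each, a shared reserve of 350 men, demand drawn at random between 50 and 100 a day) are illustrative assumptions chosen only to match the scene he describes. It compares wharf-by-wharf hiring with hiring through a single pooled reserve of the kind a labour exchange was meant to organise.

import random

random.seed(1)

WHARVES = 10        # illustrative: ten wharves, as in Beveridge's example
REGULARS = 50       # assumption: each wharf keeps 50 regular men
RESERVE_POOL = 350  # assumption: reserve men shared across the district
DAYS = 250          # roughly a working year

idle_local = unmet_local = 0     # wharf-by-wharf hiring
idle_pooled = unmet_pooled = 0   # hiring through a single labour exchange

for _ in range(DAYS):
    # Each wharf needs between 50 and 100 men a day, fluctuating
    # independently of its neighbours.
    demand = [random.randint(50, 100) for _ in range(WHARVES)]

    # Without coordination: the reserve is split evenly, 35 men tied to
    # each wharf, unable to move when their own wharf is slack.
    per_wharf_reserve = RESERVE_POOL // WHARVES
    for d in demand:
        needed = max(d - REGULARS, 0)          # regulars are hired first
        hired = min(needed, per_wharf_reserve)
        idle_local += per_wharf_reserve - hired
        unmet_local += needed - hired

    # With a labour exchange: one pooled reserve serves all ten wharves.
    total_needed = sum(max(d - REGULARS, 0) for d in demand)
    hired = min(total_needed, RESERVE_POOL)
    idle_pooled += RESERVE_POOL - hired
    unmet_pooled += total_needed - hired

print("Wharf-by-wharf: idle reserve man-days =", idle_local,
      "| unfilled jobs =", unmet_local)
print("Pooled reserve: idle reserve man-days =", idle_pooled,
      "| unfilled jobs =", unmet_pooled)

On these made-up numbers, total demand across the ten wharves varies far less, proportionally, than any single wharf’s, yet wharf-by-wharf hiring produces idle men and unfilled jobs on the same day; pooling the reserve removes nearly all of the unfilled work – the “organised fluidity of labour” Beveridge had in mind.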


The Liberal party administrations that governed Britain before the first world war changed Britain for ever. They modernised the tax system, differentiating between earned and unearned incomes, and introduced graduated rates for the roughly 3% of the population who qualified to pay income tax. David Lloyd George, the Liberals’ charismatic chancellor of the exchequer, announced a “People’s Budget” in 1909 – one for “raising money to wage implacable warfare against poverty and squalidness”.

The budget helped pay for the old-age pensions that had been introduced in 1908; the next step, national insurance, aimed to tackle interruptions to earnings among the working-age population. Following the example of similar schemes in Germany, national insurance involved weekly contributions from three groups: workers, their employers and the state. All wage earners aged 16 to 70 and earning less than £160 a year who paid contributions of four pence a week (three in the case of women) could claim sick pay for up to 26 weeks a year, and treatment from a government-approved doctor. But another aspect of the Liberals’ plans had not been tried anywhere else before: 2.25 million men in a number of trades and industries, such as construction and shipbuilding, where work could be brought to a halt by something as unpredictable as the weather, were to be enrolled in a scheme of compulsory unemployment insurance, which offered benefits of seven shillings a week for up to 15 weeks a year in exchange for contributions of two and a half pence a week.

These schemes had obvious limitations. Pensions were meagre; unemployment insurance was mainly for skilled men; health insurance excluded hospital care, and spouses and children. Almost everyone found something to be unhappy about. The British Medical Association complained about the prospect of doctors being forced to become government employees, while friendly societies, trade unions and private insurance companies thought the state was trying to force them out of business. Middle-class households resented being made to pay out to insure their domestic help. Moreover, Labour MPs complained about the contributory system. What about those who couldn’t pay in or who found themselves out of work for longer than 15 weeks? Why not follow the example of pensions and pay benefits to all out of general taxation?

The Liberal government recognised that national insurance on its own would not tackle interruptions to earnings; interventions into the economy would be required, too. Beveridge was drafted in by Winston Churchill, president of the board of trade, to help roll out a system of labour exchanges – another idea borrowed from Germany. A forerunner of the modern jobcentre, labour exchanges were an important part of the government’s plans for administering national insurance, with employers offered incentives to advertise vacancies in the exchanges and the unemployed asked to visit them to demonstrate they had looked for a job. For Beveridge, this had the potential to create an “organised fluidity of labour” that eliminated the kinds of problems he had observed in east London.

Migrant workers at a jobs noticeboard in west London. Photograph: David Levene for the Guardian

Labour exchanges certainly helped some people find jobs, but they were never the dynamic sites of free-flowing information and recruitment that Beveridge imagined. Pensions and national insurance proved much more successful and durable, though. After the first world war, a succession of governments extended and reformed the schemes in significant ways. The result, building on the centuries-old guarantee of help for the destitute, was an imperfect yet impressive system that offered assistance to many, though far from all, people in times of need.

During the Great Depression of the 1930s, however, when the Labour prime minister Ramsay MacDonald was expelled from his own party after he pushed through a 10% cut to unemployment benefits, there was widespread concern that these schemes were unsustainable. Interruptions to earnings looked like a minor problem when many people feared the complete collapse of the global economic system. What Britain seemed to have, Beveridge later suggested, was a series of “patches” – things that could be sewn on to the country’s tearing fabric, rather than solutions to its underlying problems. Perhaps capitalism – a system that treated unskilled workers without the fallback of insurance as dispensable – was the real problem. There had to be a way to manage the economy that would transform life for people in Britain and enable national insurance to offer genuine security to all.


The economist John Maynard Keynes was always clear about whose side he was on. “I can be influenced by what seems to me to be justice and good sense,” he explained, “but the class war will find me on the side of the educated bourgeoisie.” Although many of his contemporaries threw their lot in with Labour during the interwar years, believing they were the only realistic hope for progressive reform in an era of universal suffrage, Keynes stood firm. He was a Liberal and he intended to do everything he could to help the party – going so far as to help formulate its economic policy under Lloyd George’s leadership during the late 1920s and early 30s. Keynes was not alone. Beveridge might have kept his allegiance quiet in a bid to appear neutral, but in 1944 he won a byelection for the Liberals, and ran the party’s 1945 general election campaign.

Keynes cemented his status as the most important economist of the 20th century during the mid-1930s, when he published The General Theory of Employment, Interest and Money – the book that would serve as the set text for one side of the argument about how governments should respond to downturns and recessions. The world has revisited this argument regularly since the interwar years, including after the financial crash of 2008, when Nobel prize-winning economist Paul Krugman urged governments not to forget the lessons that Keynes had once taught them.

Thrashed out over five years of debate and discussion with his research students and colleagues at Cambridge, The General Theory is now widely known for a relatively small number of ideas and a simple message. Governments should resist the temptation to cut back during recessions, Keynes argued, because their root cause was a contraction in aggregate demand – the total amount of goods and services that are purchased in an economy – which collapsed when people and organisations, uncertain about the future, simultaneously chose to hold on to their money. Keynes explained how spending money had effects that rippled outwards in the economy, including the creation of employment, as the demand for goods and services increased. Governments should stop worrying about deficits and sound finance when times were bad – they could take care of them once everything was moving again.

Unlike Keynes, Beveridge was not invited to help run the economy during the second world war, and was disappointed not to be fully involved. By the summer of 1941, the government had tired of his sniping from the sidelines, and gave him a job they thought would keep him out of sight and mind for some time to come: a review of national insurance. In November the following year, he delivered the result: his ground-breaking report, Social Insurance and Allied Services.

The Beveridge report, Social Insurance and Allied Services, from 1942. Photograph: The National Archives

The Beveridge report was not quite what people were expecting, or what people now think it is. The British people did not want a “Santa Claus state” that handed out gifts to everyone, he argued; they wanted a form of economic and social security that reflected a history of paying into the system. Benefits should cover a much wider range of risks, Beveridge explained, but they should be simple to understand: everyone – workers, employers and the state – should pay flat-rate contributions and get flat-rate benefits paid out in return.

On the face of it, there was no reason for the government to be worried. Beveridge’s benefits scheme involved significant extensions and new responsibilities, but it also required people to pay hefty weekly contributions. However, Beveridge had made a number of recommendations, which he modestly called “assumptions”. The first two were an allowance for each child born (after the first) and a National Health Service, free for all people, both of which would be paid for out of general taxation. The other was that the government commit to a new way of running the economy: one in which they made sure that unemployment never went above 8.5%.

Beveridge had called these recommendations assumptions because he believed a reformed system of national insurance could not work without them. Unemployment had to be kept below 8.5% so that people could build up a history of contributions and the system did not collapse under the weight of demand when they needed help. Indeed, the easiest system to administer was one in which authorities could safely assume there were jobs for the vast majority of people.

Initially, the government was not as enthusiastic about committing to his recommendations as the public, who bought an astonishing 100,000 copies of Social Insurance and Allied Services during the month after it was published. The prime minister, Winston Churchill, refused to comment on the report for three months, and offered a vague endorsement when he eventually did. By the end of 1944, however, when victory over Germany looked certain, a series of white papers committed the British state not only to Beveridge’s plan, but also to a number of other new policies, such as a new secondary school system with a leaving age of 15. The roadmap for social reconstruction had been drawn.


The welfare state that came into being during the late 1940s underpinned a whole way of life that politicians only started to pull apart from the early 1980s onwards. The intention during the third quarter of the 20th century was to bring capitalism under control, specifically its tendency to interrupt and put downwards pressure on people’s earnings, rather than dispense with the system entirely. The Labour party, which won a historic landslide in the election of July 1945, put its mark on the whole project, in particular by nationalising whole swaths of industry. Yet, after half a century of debate and legislation, each political party had left fingerprints on the final product.

These points matter for a number of reasons. One is that we often assume the welfare state was a collectivist venture. But even strident individualists found reasons to support it. Indeed, the era of social democracy helped create successive generations of individualists, including the working-class people who suddenly found themselves socially mobile during the 1950s, 60s and 70s. Looking back, and quite understandably, those generations can often give in to the temptation to imagine this progress was all down to their own hard work. Yet, as sociologists such as John Goldthorpe have shown, these generations rode the economic and social wave created by the policies adopted after 1945. Economic growth expanded the middle class by creating new management-level jobs into which working-class people could move, in both the public and private sectors, meaning there was “more room at the top”. Moreover, in an era of full employment, home ownership started to rise, not only because new houses were built, but also because it was perfectly reasonable for banks to assume that people would hold down a job for 25 continuous years and therefore pay back any money they borrowed.

Could that strategy be repeated today? The answer that has been given repeatedly for the past decade – and in some cases longer – is no. We have come to see the welfare state simply as a cost to be kept down rather than part of an economic and social strategy that aims to deliver security for all and opportunities to obtain more for those who want to. The idea that these goals are no longer obtainable is clearly false. A good start would be to reconnect with the liberal idea, now more than a century old, that everyone sees returns when they pool risks, whether it’s the individuals who can stop worrying about what is around the corner, governments that might otherwise cut their headline costs but succeed only in shifting those costs somewhere else, or the companies that benefit from healthy and educated workers operating in a safe environment. A successful economy requires all these actors to understand that they need to give, not just take, in order to build an environment in which they and those that follow them are able to succeed.

Migrant workers waiting to be collected for farm and factory work in East Anglia. Photograph: David Levene for the Guardian

Are more radical measures required? In the long term, yes. The world has changed since the early 20th century: businesses and individuals behave differently and the “assumptions”, as Beveridge would have called them, that go with national insurance have evolved. The trend has been to pay for things by pushing the costs on to individuals, as has been done with university tuition fees. But there seems only so much mileage in this approach when debt is reaching dangerous levels, wages are stagnant and, as the economist Thomas Piketty has shown, income generated by wealth has increased rapidly for those lucky enough to have it.

One appropriate response would be to breathe new life into the radical strand of liberalism that differentiated between earned and unearned incomes back at the start of the 20th century. Piketty has argued for a global tax on wealth. But there are domestic policies that would go some way to achieving similar ends. We could consider applying capital gains tax to property – recouping some of the considerable profits that those generations who benefited from the welfare state have acquired from the houses they were able to buy, in part because of it.

Some commentators suggest what seem like even more radical ideas, such as universal basic income (UBI): a guaranteed regular payment for every citizen that would keep them above the poverty line, even if they chose not to work. UBI would deliver security, but faces numerous technical challenges, not least the significant differences in living costs across the country, which make a “universal” sum impossible to settle on, even before tackling the political problems of accusations that it would simply make everyone a benefit claimant. Yet versions of the idea have found support across the political spectrum, from neoliberals such as Milton Friedman to the leftwing economist and one-time Greek finance minister Yanis Varoufakis. For the left, a basic income would give people security and dignity. For the right, however, that basic security would be valuable because it would mean people would be free to take the kind of irregular work offered by the gig economy or zero-hours contracts. The lesson of these differences and convergences of opinion is that tackling economic insecurity need not be done at the expense of efficiency, competitiveness or innovation.

Main image: unemployed labourers waiting for work at a dockyard in March 1931. Photograph by Fox Photos/Getty

Bread for All by Chris Renwick is published by Allen Lane at £20. To buy it for £14 go to bookshop.theguardian.com or call 0330 333 6846. Free UK p&p over £10, online orders only. Phone orders min p&p of £1.99.

Follow the Long Read on Twitter at @gdnlongread, or sign up to the long read weekly email here.

A million tons of feces and an unbearable stench: life near industrial pig farms

Rene Miller pokes a lavender-frocked leg out of her front door and grimaces. It’s a bright April afternoon, and the 66-year-old Miller, with a stoic expression and a dark crop of curls, braces herself for the walk ahead.

Her destination isn’t far away – just a half-mile down a narrow country road, flanked by sprawling green meadows, modest homes and agricultural operations – but the journey takes a toll. Because as she ambles down the two-lane street, stepping over pebbles and sprouts of grass, the stench takes hold, an odor so noxious that it makes your eyes burn and your nose run. Miller likens it to “death” or “decomposition”, to being surrounded by spoiled meat.

As bad as it is today, she says, it’s nothing compared with the way it is on a muggy afternoon in August, when the stink hovering in the stagnant, humid air can nearly “knock you off your feet”.


Still, Miller makes this trip often, to honor her family and pay her respects. She points ahead to her family cemetery, which sits just off Veachs Mill Road in Warsaw, an hour’s drive east from Raleigh. It’s a stone’s throw from her one-story, white-walled house, part of a tract of land her great-grandmother inherited as part of a post-slavery land grant. When she gets to the cemetery, she stops in front of her nephew’s grave, recalling his life and his death from cancer. Purple and yellow wildflowers nip at its edges; nearby, a Steelers flag rustles in the wind.

“How long have we lived here? Always,” she says, gazing at her grandmother’s headstone. “And we always will. Nobody else will ever live on this land.”

The odor isn’t just her problem. It’s ubiquitous across parts of eastern North Carolina. It’s the smell of hog country, of millions of pigs and even more tons of their feces. For years, their waste and its stink have been the subject of litigation, investigations, legislation and regulation. A growing body of research has documented the industry’s health and environmental risks. The issue has been well examined in the media, too. The New York Times and the Washington Post covered it. So have Dateline and 60 Minutes. The News & Observer earned a Pulitzer Prize for reporting on it in 1995.

But the stench – and its consequences, both for the lower-income, largely African American neighbors of hog farms and the state’s environment – lingers.

Recognized as ‘environmental racism’

Rene Miller is currently involved in a lawsuit against the hog farm which sprays hog waste on to a field across the street from her home. Photograph: Alex Boerner

To understand Rene Miller’s predicament, you have to start with the pigs.

Their population in North Carolina increased more than threefold in just one decade, from 2.8 million in 1990 to 9.3 million in 2000 – where it’s stayed, more or less, ever since.

In 1986, North Carolina ranked seventh in the country in pork production; 30 years later, it’s second only to Iowa, with an estimated 9 million pigs on 2,217 hog farms, according to the US Department of Agriculture’s quarterly hog survey and the 2012 US Census of Agriculture. The pigs have ushered in a $2.9bn-a-year industry that employs more than 46,000 people in North Carolina. But those hogs also produce millions of tons of feces. In one year alone, an estimated 7.5 million hogs in five eastern North Carolina counties produced more than 15.5m tons of feces, according to a 2008 report by the General Accounting Office.

Nowhere are the impacts more profound than in Duplin County, where Miller and about 2.3 million hogs live – more than anywhere else in the state, according to the Environmental Working Group, a research and advocacy organization.

A recent analysis of county and satellite data by the EWG found that roughly 160,000 North Carolinians live within a half-mile of a pig or poultry farm; in Duplin, nearly 12,500 people, more than 20% of its residents, live within that range. If you extend the radius to three miles, as many as 960,000 North Carolinians fall into that category. That’s nearly 10% of the state’s population.

For Miller, these numbers aren’t abstractions. They’re her life.

“That scent is so bad,” she says. “You can’t go outside. You can’t go outside and cook anything because the flies and mosquitoes take over.”

Within a mile of her property, Murphy-Brown LLC – a subsidiary of Smithfield Foods, the largest hog producer in the world – owns 5,280 hogs, according to the NC Department of Environmental Quality. Within two miles, there are more than 80,000 Murphy-Brown-owned hogs at seven different farms, according to a lawsuit Miller filed in 2014.


Fifty yards from Miller’s family graveyard is a massive open-air cesspool storing the pigs’ waste – a stagnant pool containing their feces, urine, blood and other bodily fluids – often referred to as a “lagoon”, one of about 3,300 lagoons across the state. When the cesspool reaches its capacity, its contents are liquefied and sprayed into a field across the street from Miller’s house via a large, sprinkler-like apparatus. The sprayer releases a mist of waste on to the field, which, according to court documents, is about 200ft from Miller’s home at its closest rotation.

That system prevents the cesspool from overflowing, but Miller says it also makes her life miserable.

It’s more than just the smell, she says. The liquefied waste mist drifts on to her property, and “dead boxes” filled with rotting hogs sit near her family’s cemetery, attracting buzzards, gnats and swarms of large black flies. After spending time outside, she says, her eyes burn and her nose waters.

She says she also suffers from asthma, which she began to develop shortly after she returned to her childhood home from New Jersey in the late 80s to care for her ailing mother.

Research published by the late Steven Wing, a professor of epidemiology at the University of North Carolina’s Gillings School of Global Public Health, linked similar health concerns to proximity to hog farms.

Wing, who passed away in November, described his research in a 2013 TED Talk:

“In 1995, I began to meet neighbors of industrial hog operations,” he said. “I saw how close some neighborhoods are to hog operations. People told me about contaminated wells, the stench from hog operations that woke them at night, and children who were mocked at school for smelling like hog waste. I studied the medical literature and learned about the allergens, gases, bacteria, and viruses released by these facilities – all of them capable of making people sick.”

Young hogs are gathered in pens at Butler Farms in Lillington, NC. The hogs live on slatted flooring through which their waste is washed and gathered before being pumped into covered lagoons. Photograph: Alex Boerner

Wing’s research showed a correlation between air pollution from hog farms and higher rates of nausea, increases in blood pressure, respiratory issues such as wheezing, increased asthma symptoms in children, and an overall diminished quality of life for people living nearby.

“Air pollutants from the routine operation of confinement houses, cesspools, and waste sprayers affect nearby neighborhoods where they cause disruption of activities of daily living, stress, anxiety, mucous membrane irritation, respiratory conditions, reduced lung function, and acute blood pressure elevation,” Wing and fellow UNC researcher Jill Johnston wrote in a 2014 study.

They also found that the state’s industrial hog operations disproportionately affect African Americans, Hispanics and Native Americans. That pattern, they concluded, “is generally recognized as environmental racism”.

The environmental racism argument has won some powerful allies, including US Senator Cory Booker, a New Jersey Democrat who in a recent podcast interview denounced the North Carolina hog industry, which he called “evil” for exploiting its African American neighbors.

“They fill massive lagoons with [waste] and they take that lagoon stuff and spray it over fields,” he told Pod Save America, recalling a trip to North Carolina late last year. “I watched it mist off of the property of these massive pig farms into black communities. And these African American communities are like, ‘We’re prisoners in our own home.’ The biggest company down there [Smithfield] is a Chinese-owned company, and so they’ve poisoned black communities, land value is down, abhorrent … This corporation is outsourcing its pain, its costs, on to poor black people in North Carolina.”

Booker, whose father grew up in Hendersonville and graduated from NC Central, told the INDY in a statement: “I saw firsthand in North Carolina how corporate interests are disproportionately placing environmental and public health burdens on low-income communities of color that they would never accept in their own neighborhoods. In North Carolina, large corporate pork producers are mistreating small contract farmers and externalizing their costs on to vulnerable communities, polluting the air, water, and soil, and making kids and families sick while reaping large financial rewards.

“And unfortunately, we know this is not just a problem in North Carolina. Similar environmental injustices are occurring right now all over the United States. This is unacceptable to me, and I’m in the process of finding ways for the federal government to start to meaningfully address this problem.”

In May, US Representative David Price, a Democrat who represents parts of Wake and Orange counties, took his own stab at a legislative solution. He introduced the Swine Act, a bill intended to improve environmental standards for North Carolina’s hog industry.

“It’s a problem our state has a huge stake in,” Price says. “It’s a matter of finding the political will to get ahead of the curve here. Because if we don’t do something like this, if we don’t get these farms on to a sounder waste-disposal system, we’re going to live to regret it.”

While Price’s bill is currently languishing in committee, this issue is already making its way through the courts.

Three years ago, Miller and more than 500 other North Carolina residents, mostly poor and African American, filed 26 federal lawsuits against Murphy-Brown, alleging its behavior adversely affects their health and quality of life. The lawsuits argue that Murphy-Brown’s parent company, Smithfield – which was purchased by the multinational Chinese corporation WH Group in 2013 for an estimated $4.7bn – has the financial resources to manage the pigs’ waste in a way that minimizes the odor and nuisance to nearby property owners.

The industry dismisses these claims.

“North Carolina’s hog farmers are under a coordinated attack by predatory lawyers, anti-farm activists and their allies,” Smithfield Foods told the INDY in an email. “The lawsuits are about one thing and one thing only: a money grab.”

Smithfield points out that between 2012 and 2016 the DEQ received only 25 odor complaints, none of which resulted in fines or notices of violation.

“More than 80% of hog farms are owned and operated by families,” Smithfield argues. “They produce good products, they do it the right way, and they strive to be good neighbors.”

Other industry advocates have also alleged that greed is at the heart of these claims. Hog farmers are conscientious neighbors, the industry argues, and both Smithfield and the NC Pork Council, a trade group funded by commercial hog operations, point out that the lawsuits don’t ask farmers to change specific behaviors. The Pork Council has blamed the lawsuits on avaricious attorneys who “like to sue farmers for as much money as possible”.

“Most farmers live on or adjacent to their farms and work hard to take good care of the land,” says Andy Curliss, the CEO of the Pork Council. “They are an integral part of the communities in which they live. They do things the right way and strive to be good neighbors.”

In an email, Mark Anderson, an attorney representing Murphy-Brown, says the company “is aggressively contesting the plaintiffs’ claims. After careful study, we concluded that the claims are not valid and have no merit.”

But Miller says she knows what she’s experienced – and that life on Veachs Mill Road has deteriorated since the hog houses came.

“Right now,” she says, “my life is the worst it’s ever been.”

The cases are pending in federal court. No trial dates have been set.

Boss Hog’s crown jewel

North Carolina’s pork production industry has shifted dramatically since the mid-80s. Today’s industrial farms, often called concentrated animal feeding operations, or CAFOs, raise pigs and other livestock in confinement until they are ready for slaughter. The hogs generally live in cramped quarters; Michelle B Nowlin, supervising attorney of Duke’s Environmental Law and Policy Clinic, estimates they typically get seven or eight square feet of space each.

When they have to relieve themselves, slatted, slanted floors filter their waste into pits that feed into open-air cesspools that sit just behind the hog houses. These pools, known as lagoons, come in muted tones of brown and sometimes Pepto-Bismol pink, courtesy of a cocktail of chemicals and pig waste.

The move from small family farm to massive commercial operation didn’t happen overnight. Starting in the 80s and early 90s, a new method of pig farming began to take hold as corporate hog producers and CAFOs began replacing independent, family-owned farms. In an arrangement known as contract farming, many larger companies bought family farms or partnered with them, providing pigs in exchange for land and waste management services.

As the state gravitated toward a corporate production model – one already in place for the poultry industry – thousands of independent farmers left the business. According to census data, the number of hog farms in the state fell from more than 11,000 in 1982 to 2,217 in 2012.

Nobody was more influential in reshaping the industry than Wendell Murphy, a powerful Democratic state legislator and the subject of the News & Observer’s Pulitzer Prize-winning “Boss Hog” series. Murphy, a high school agriculture teacher turned farmer from Rose Hill, grew to become the nation’s top hog producer during his tenure in the general assembly, from 1982 to 1993. While in office, he backed legislation to provide poultry and hog farmers with tax breaks and exemptions from environmental regulation, helping “pass laws worth millions of dollars to his company and his industry”, the News & Observer reported.

This included the 1991 “Murphy Amendment”, which exempted poultry and animal operations from stricter regulations on air and water pollution, and a 1991 bill that barred counties from imposing zoning restrictions on hog farms. In 1986, he voted in favor of a bill that eliminated sales taxes on hog and poultry operations.

By 1995, Duplin County was home to more than 1 million hogs, more than six times the number it had when Murphy was first elected. Most of them, the News & Observer reported, belonged to none other than Murphy Family Farms, Boss Hog’s crown jewel. Five years later, Murphy Family Farms was acquired by Smithfield, and its name was changed to Murphy-Brown LLC.

With the acquisition, Smithfield became the world’s largest hog producer.

‘Heaven 4 Hogs, Hell 4 Humans’

Former hog farmer Don Webb, of Stantonsburg, is critical of the way hog farmers dispose of waste. Photograph: Alex Boerner

To an outsider, it might seem like business as usual. Murphy played politics and his company came out for the better. But for Don Webb, a living history book on all things related to hog farming, something was amiss.

A former hog farmer, Webb bore witness to the industry’s explosive growth firsthand. He grew up on a farm in Stantonsburg and cropped tobacco and picked corn with his bare hands. His father sold pigs right off the family farm. After a brief stint as a PE teacher, Webb started a successful hog farm in Northampton County in the mid-70s. But after an up-close experience with pig waste management, Webb has since become a thorn in the industry’s side.

Webb, 76, has a thick drawl and is prone to impassioned rants. Next to a chair inside his house sits an otherwise unassuming leather briefcase decorated with bumper stickers bearing such slogans as “Welcome to North Carolina: Heaven 4 Hogs, Hell 4 Humans”.

“They say they love America, but they really love somethin’ else,” he scoffs. “It’s green. How many hog pens have you found next to a country club?”

Webb’s transition from hog farmer to fuming activist was years in the making. He got into hog farming at the urging of a friend. At his farm’s peak, he had about 4,000 hogs – a small number by today’s standards, but enough to turn a handsome profit. He managed the pig waste much as today’s farms do: the slanted floors of his hog house filtered the waste into cesspools, and when those filled up, the waste was sprayed elsewhere.

It was a matter of conscience that charted his current path, he says. Several of his neighbors told him that the stench from his farm was making their lives miserable. They grumbled about being quarantined indoors on sweltering summer evenings, unable to go outside on account of the pungent, fly-infested air.

“I said, ‘Suppose that was my mama and daddy back there,’” he says. “‘How would I feel?’ I hit the brakes on that truck.”

In 1979, after about five years in the business, Webb sold his hogs and relocated to Cape Hatteras, where “the air was fresh”. When he returned to Duplin County six years later, however, he was greeted once again by that stench. That’s what ultimately turned him into an activist, he says.

“These are human beings,” Webb says. “They’ve worked their whole lives and are tryin’ to have a clean home and a decent place to live, and they can’t go on their front porch and take a deep breath.”

While legal, hog farms’ longstanding practice of disposing of excess waste by spraying it as mist on to nearby fields has proven controversial. Farms’ neighbors have complained that the system literally brings excrement to their doorsteps, allowing the liquefied waste to ride the wind to their property. In May, Shane Rogers, a former EPA and USDA environmental engineer, published a report that concluded that this is exactly what happens.

The study, which was filed in court documents on behalf of plaintiffs suing Murphy-Brown, relied on both air and physical samples collected from the exteriors of homes located near Murphy-Brown hog fields. The homes were selected randomly, and “at every visit and every home, I experienced offensive and sustained swine manure odors to varying intensity, from moderate to very strong,” Rogers wrote.

To test for the presence of pig-manure DNA, Rogers and his team collected DNA swab samples from the exterior walls of homes and from the air itself. In total, they collected 31 samples from the outside walls of 17 homes and submitted them for DNA testing; 14 of the 17 homes tested positive.

Additionally, all six of the dust samples taken from the air “contained tens of thousands to hundreds of thousands of hog feces DNA particles”, Rogers wrote, “demonstrating exposure to hog feces bioaerosols for clients who breathe in the air at their homes. Considering the facts, it is far more likely than not that hog feces also gets inside clients’ homes where they live and where they eat.”

Anderson, Murphy-Brown’s attorney, disputes the notion that the company’s farms and contractors disrupt their neighbors’ quality of life.

“Murphy-Brown requires all of its company farms to operate properly and in compliance with strict state regulatory requirements,” he wrote in an email. “We expect the same of all contract growers. Even more, we expect all farmers to be good neighbors. If any neighbor has a problem with a farm, tell us and we will do our best to fix it.”

Promised a ‘pie in the sky’

In an interview, state representative Jimmy Dixon, a former poultry farmer and a Duplin County Republican who is perhaps the hog industry’s most outspoken ally in the general assembly, makes three fundamental beliefs abundantly clear.

One, he believes the Murphy-Brown plaintiffs’ claims are “at best exaggerations, at worst misrepresentations” and they’re being “recruited” by greedy lawyers who have “promised a pie in the sky”.

“For people to say they can’t go outside, ‘I can’t barbeque, I can’t invite my neighbors over,’ those are exaggerations,” Dixon says.

Two, he doesn’t buy studies that point to hazards associated with hog farms because “a lot of these studies, a lot of them, begin with the end product in mind, and then they construct it for the outcome”.

And three, he doesn’t think any additional regulations are necessary. What’s more, he’s frustrated that critics don’t acknowledge the industry’s waste management improvements over the last 40 years, which he calls “unbelievable”.

He’s been no less forthcoming in his public comments.

On 5 April, Dixon stepped past dozens of protesters into a crowded committee meeting inside the legislature. He was there to defend his controversial pet project, House Bill 467, which would cap the amount of money that property owners living near “agriculture and forestry operations”, including hog farms, could collect in nuisance lawsuits.

Under HB 467, people could only collect damages equal to the reduction in their property’s fair market value – which critics argue is already low thanks to the presence of the nearby farms. One Democratic representative estimated that, if Dixon’s bill passed, property owners could only recoup around $7,000 over three years.

Importantly, the bill didn’t just seek to limit future nuisance lawsuits. It would also have negated the 26 pending claims against Murphy-Brown.

Introducing the bill, Dixon said it “seeks to promote farming by clarifying and adjusting the maximum compensatory damages that can be awarded”.

Throughout the 40-minute committee discussion, Dixon’s arguments were met by an admixture of support, anger and skepticism. Representative Amos Quick III, a Democrat, questioned Dixon about the bill’s discriminatory impact, “because the plaintiffs are predominantly African American”.

Mark Dorosin of the UNC Center for Civil Rights drilled down on that point during public comments, citing research showing that the proportions of African Americans, Hispanics and Native Americans living within three miles of industrial hog operations are 1.5, 1.39 and 2.18 times higher, respectively, than the proportion of white residents.

A few days later, following heightened media scrutiny, the bill’s opponents scored a victory. The contentious provision invalidating the pending lawsuits against Murphy-Brown was stripped from the bill. With it gone, HB 467 cleared the house easily, then the senate.

On 5 May – the same day Rogers’s study showing the presence of pig fecal matter on the exteriors of homes near hog farms was filed in court – Governor Roy Cooper vetoed the bill, saying he opposed “special protection for one industry”.

The hog industry fought back. In addition to its eight registered lobbyists, Smithfield enlisted the services of Tom Apodaca, a former senator from Hendersonville. Its efforts paid off. On 10 May, the house voted 74–40, mostly along party lines, to override Cooper’s veto. The following day, the senate followed suit. HB 467 became law.

Two weeks later, Murphy-Brown filed a motion in federal court, asking a judge to apply HB 467 retroactively, thus negating the pending lawsuits despite the fact that the legislature had explicitly voted to remove that provision from the bill. The court has not yet ruled on that motion.

Deep financial ties exist between HB 467’s backers and the hog industry. Cumulatively, house Republicans who supported HB 467 have received more than $272,000 in campaign contributions from the industry throughout their careers, according to an INDY analysis of campaign finance records. Dixon has received $115,000, including $36,250 from individuals associated with Murphy-Brown and $9,500 from the Pork Council. The house speaker, Tim Moore, has garnered $44,650. Senator Brent Jackson, who sponsored the senate companion bill to HB 467, has received more than $130,000 from industry associates.

‘Everything has gone downhill’

If you Google “hog farms and North Carolina” you’ll see one name pop up again and again: Elsie Herring.

Elsie Herring. Photograph: Alex Boerner

A copper-haired 69-year-old, Herring lives on the same property her mother, the daughter of a slave, lived on for 99 years. Herring’s childhood memories are built around her family’s land in Wallace – of growing and farming tobacco, cucumbers, soybeans, strawberries, and peanuts; of canning food; of smoking and curing meat. Even though those were Jim Crow days, she remembers it as a “happier, healthier time. Everything was segregated, but we still got along. But now, after these hogs came in, everything has gone downhill.”

Herring’s home is adjacent to a farm that contracts with Murphy-Brown to raise 1,180 of its pigs, according to DEQ records and a lawsuit she filed. The lawsuit contends that the hog facility began spraying liquefied waste in the mid-90s and planting trees between their properties to act as a buffer, which proved ineffective.

Herring says her grandfather purchased the property in the 1880s from his aunt, who was white and his slave mistress. Her parents built the home she now lives in; her mother, father, brother and sister all lived on the land until they passed away.

Herring came back to Wallace from New York in 1993 to look after her elderly mother and her brother, who had Down’s syndrome. About two years after she moved back, the spraying began, Herring says. She vividly remembers the time it happened – an otherwise uneventful Saturday evening.

“We were just sitting here having our Saturday evening like we usually do, enjoying,” she recalls. “And in a short time, we heard this bursting sound, and then all of a sudden it started stinking like nothing you’ve experienced.”

Herring felt like she was going to be sick, so she went back inside.

“If you would have stayed out there,” she says, “you would have probably had to end up going to the hospital because this stuff was being released and you’re breathing it in.”

After that, Herring says, the spraying happened “all day, every day”.

The stench became so unbearable that Herring eventually contacted the Duplin County sheriff’s office, the Duplin County department of health, and the NC department of environmental and natural resources for help – all to no avail, she says. She became involved in local activist networks, joining the NC Environmental Justice Network and the Warsaw-based organization Rural Empowerment and Community Health, or Reach.


In 2007, her activism took her to the lawn outside the general assembly, where she joined other Reach members to protest about the effects of hog farming for more than 50 consecutive hours. According to her lawsuit, Herring “called or wrote letters, or both, to the Governor, the state and local health departments, the Attorney General of North Carolina, the United States Justice Department, DENR, the local sheriff and police departments, the county commissioners, the federal EPA, her congressman, and the owner of the hogs [Murphy-Brown]”.

Though the spraying has subsided over the past few months – perhaps as a result of the lawsuits, Herring says, though she can’t be sure – life is still “no picnic”. She ticks off a list of issues she believes the stink and the spraying have brought: flies, mosquitoes, mice, poisonous snakes. To avoid the odor, she stays indoors.

“It’s like living in prison,” she says.

HB 467 came as a surprise, she says. But, to her, its motives were transparent. Like many of her fellow activists, she’s all too aware of the racial dynamics at play.

“This is environmental racism,” she says. “This is my family land. And I’m sure race played a part when they decided they wanted to develop this area.”

Herring sighs. “We’ve been asked many times, ‘Why don’t you just move?’ Move and go where? I don’t want to move. I never knew my grandfather, but I know he walked on this ground. And his family.”

She pauses and looks at her house.

“It’s my land.”

Out-of-towners

Throughout the debate over House Bill 467, the arguments proffered by state representative Jimmy Dixon and other supporters centered on “hardworking farm families” besieged by “frivolous lawsuits” filed by greedy, out-of-state attorneys.

“You do know that the original lawyers were banned from North Carolina,” Dixon told the INDY, while shrugging off the plaintiffs’ claims about the stench associated with hog farms as “exaggerations”.

Mark Anderson, an attorney representing Murphy-Brown LLC, brought up the same point in an email: “You should know that the original claims were filed by out-of-state lawyers who went door to door, actively recruiting plaintiffs and promising them large sums of money if they joined the lawsuits. The lawyers’ conduct led to them being thrown out of the cases because of ethics violations.”

The out-of-state-lawyers claim is a go-to for the bill’s champions, and for good reason: it’s entirely accurate.

The Salisbury-based Wallace & Graham is now handling the plaintiffs’ 26 federal nuisance lawsuits against Murphy-Brown, a subsidiary of Smithfield Foods, but that wasn’t always the case. The suits were initially filed in Wake County superior court in 2013 by two out-of-state firms, whose lawyers recruited clients in North Carolina without a state license and signed hundreds of clients to contracts requiring them to pay hundreds of dollars an hour for work performed on their behalf, even if the attorneys decided to drop the case, as the News & Observer previously reported.

In hearings, Judge Donald Stephens admonished the firms for their behavior and, after the contracts were rewritten, required that they partner with a North Carolina firm, which ended up being Wallace & Graham. About two months after the firms teamed up, Wallace & Graham’s attorneys told Stephens they could no longer work with the out-of-towners. Stephens then took away the out-of-state lawyers’ privilege to practice in his court and said he didn’t “ever want to see them again or hear from them again”.

Wallace & Graham refiled the lawsuits in federal court, but the behavior of the attorneys booted from the case has left its mark, giving the industry’s defenders ammunition with which to accuse the Murphy-Brown plaintiffs or their attorneys of cupidity.

In a statement to the INDY, Smithfield Foods called the lawsuits a “cash grab”. At a house committee meeting in April, Dixon accused the plaintiffs’ current lawyers of manipulating their clients: “When the final chapter is written on these cases, we’ll see the people being represented are being prostituted for money.”

In a statement, Wallace & Graham said: “We are proud to represent our clients in this litigation. They look forward to having their day in court.”

But it added: “Until the trial is over, we choose to make no further comment on the cases.”

This story was first published by Indyweek – news, culture & commentary for Raleigh, Cary, Durham, Chapel Hill. Read Part 2 and Part 3.

HPV screening better at detecting cervical cancer than pap smear, trial shows

Australia’s new national cervical cancer screening program has received a boost, with a large clinical trial showing screening for the human papillomavirus is significantly better at detecting potential precancerous cells than the traditional pap smear.

“We found that the HPV test was substantially more effective at picking up high-grade abnormalities compared to the pap test,” said Prof Karen Canfell, director of research at Cancer Council New South Wales.

The results, published in the international journal PLOS Medicine, have been released less than three months before the transition to the new national cervical cancer screening program that eliminates the need for women to have a pap smear every two years.

From 1 December, women from the age of 25, instead of 18, will be required to have a five-yearly HPV test, replacing the two-yearly pap test.

Previous estimates have suggested the new screening program would lower cervical cancer incidence and mortality by at least 20% owing to the more accurate test. But until now it had not been tested among women with a high uptake of the HPV vaccine.

To examine its effectiveness, researchers at Cancer Council NSW and the Victorian Cytology Service compared detection rates of high-grade cervical abnormalities between the HPV test and the pap test among 5,000 women aged 25 to 64. The women attended routine screening at 47 participating clinics in Victoria.

They were randomly assigned to either have a five-yearly HPV screening test or a 2.5-yearly liquid-based cytology screening (pap test).

The study found that, compared with the pap test, HPV screening “significantly” increased detection of high-grade precancerous cervical lesions among those who had been vaccinated: the overall detection rate was 0.1% with the pap test versus 2.7% with HPV screening.

“These findings provide initial support for the implementation of primary HPV screening in vaccinated populations,” the authors wrote.

Canfell said increased detection means greater long-term protection against the development of invasive cancer.

“This adds to existing evidence about how much more accurate and effective HPV screening is,” she said. “We now have a superior method for detecting not just the virus that causes cervical cancer, but also high-grade abnormalities.”

The second phase of the Compass trial is recruiting and hopes to have more than 120,000 participants.