Saturday, July 4, 2009

Breadline USA: Why People Are Going Hungry in the Land of Plenty

By Sasha Abramsky, PoliPoint Press. Posted July 4, 2009.


America's poor are being priced out of a market flush with excess eatables. It's an abomination we can fix.

From Breadline USA: The Hidden Scandal of American Hunger and How to Fix It © 2009 by Sasha Abramsky. Reprinted with permission from PoliPointPress, LLC, Sausalito, CA.

When the Month is Longer Than the Money

Billy MacPherson believed that for many of her friends and pantry clientele “the months are longer than the money.” What little income they brought in each month— from work, from Social Security or disability checks, in food stamps or welfare payments— was never quite enough to last a full four-plus weeks. And so they faced an unpalatable choice: try to stretch the family budget to cover the whole month, which involved scrimping on food and missing meals throughout the entire period, or eat semi-decently for the first two or three weeks of the month and pray that something, somehow, would come about to tide them through the lean times at the end.

Once gas prices started going up, food prices also headed north— at least in part because so much corn and arable land was diverted into biofuel production in response to the energy crunch; in part, too, because oil-based fertilizers soared in price and inflation took root throughout the broader economy. In the last years of George W. Bush’s presidency, that lean period at the end of each month began to grow. Instead of a few days, it became a week; then it became ten days, even two weeks. For low-income Americans, wages and government checks lagged far behind inflation, leaving them little choice but to watch as month after month their never particularly munificent purchasing power collapsed.

In the years following 2005, as the price of staples such as wheat and rice more than doubled, deadly food riots broke out in Bangladesh, Haiti, Cameroon, Yemen, Mexico, Egypt, Burkina Faso, and several other countries. People earning one or two dollars a day were facing starvation caused not by drought or plagues of locusts but by the workings of the international commodities market. In some nations, governments were brought to their knees by the disturbances; in others, panicked ministers met in emergency sessions to limit crop exports and try to shore up their populaces’ food supplies.

By 2008 America’s impoverished classes were, albeit to a lesser extent, facing a similar price-induced hunger. Unlike the destitute of countries such as Ethiopia and the Sudan, who too often went hungry because crops failed and what little food there was got bought up by their richer neighbors, America’s poor were being priced out of a market flush with excess eatables. Theirs was a hunger amid plenty, an inability to buy their way to seats at the most food-laden table in history. At the same time as hungry Milwaukee residents— on false rumors of free food deliveries— were fighting each other for access to hoped-for supplies in the spring of 2008, at the same time as immigrant shoppers in many neighborhoods were stampeding to buy up large bags of rice in the face of rising prices, hot dog–eating and fried asparagus–eating competitions were gaining in popularity from the Coney Island boardwalk in New York to the agricultural town of Stockton, California. One visit to any of these binge-eating orgies would have been enough to put paid to the notion that American hunger, twenty-first-century style, was in any way about the country as a whole facing food shortages. Yes, food prices were rising, but they were rising due to increased energy costs and growing global demand for American food exports rather than in response to a collapse in the nation’s food supply. The country’s growing epidemic of hunger was less a symptom of food market contractions and more one of the stealth spread of poverty and inflation into more and more corners of American life.

The U.S. government’s official poverty line in 2008 was $10,590 for a single person, $13,540 for a couple, $16,530 for a family of three, and $21,203 for a family of four. And the Census Bureau estimated that over 37 million Americans (including noncitizen residents) were living at or below these income levels. But that only hinted at the growing scale of American poverty. Economists such as Bob Pollin, codirector of the Political Economy Research Institute at the University of Massachusetts, believed that tens of millions more Americans were living on incomes that, while they might meet a denuded government “minimum-wage” threshold, in reality couldn’t be expected to meet a family’s basic needs.

Pollin’s team calculated that a single person needed to earn ten dollars an hour to achieve even a semblance of economic security; and, as with the poverty line, the dollar amount of this measure, which he called a “living wage,” rose as the number of people in the family increased.

Guaranteeing a living wage was an ambitious goal, one that a number of localities had been trying to implement since the mid-1990s, when Baltimore’s city council passed a limited living-wage bill that affected about fifteen hundred local workers employed by companies that did business with the city. And nowhere were such local measures more of a hot-button issue than in Santa Fe, New Mexico.

In the late winter of 2006, Santa Fe’s then-mayor David Coss sat behind his large desk discussing the city’s living wage, his long, wiry body draped in an expensive gray-brown linen suit, a cream shirt and dark-patterned tie, his hair neatly coiffed, his graying goatee smartly trimmed. A Georgia O’Keeffe poster of a horned animal’s skull hung on the wall behind him. A second poster, in pastels, showed off a glorious Southwestern desert and mountain landscape— a world of swirling dreams and endless possibilities. Coss had a background as an environmental scientist and a union organizer; he had risen to power at City Hall at least in part because of his assertive championing of the most comprehensive living-wage statute in America.

Three years earlier, after a decade-long campaign by social justice activists, seven of the eight councilors in the chic— and expensive— desert town voted to raise the city’s minimum wage to $8.50 an hour, with successive increases built in that would hike it up to $10.50 by 2008. In the years following, despite litigation from opponents of a living wage, the courts rejected challenges to the law, and public support for the change remained high— notwithstanding doom-and-gloom prognostications from the town’s tourism-dominated service industries. Santa Fe’s living wage was, Coss averred, “basic economic fairness in making the economy work for everyone and not just the people at the top.” When the chamber of commerce ran candidates against the four councilors most outspoken in their support of the living wage, the chamber’s candidates were all soundly beaten on Election Day.

In a town with a high percentage of practicing Catholics, the living wage in Santa Fe was pushed not just as a sensible economic move— as a way to stimulate spending and savings cycles along the bottom edge of the labor market— but as a moral imperative, reinforced by the authority of papal encyclicals dating back to Leo XIII at the tail end of the nineteenth century. “No one who works full time should have to live in poverty,” Monsignor Jerome Martinez stated. The monsignor was a middle-aged man with a shock of curly gray hair, a warm smile, and a deeply suntanned, slightly pocked face. He shared his cluttered office in an annex to the spectacular Cathedral of St. Francis with two large green cacti and several oil paintings of Jesus. “The dignity of the worker is more than just being a cog in the industrial machine. The Just Wage provides sustenance, housing, minimum health care, retirement benefits, and that the worker should have an opportunity to be generous. The ability to be generous is an important aspect of the church. It makes you feel more like a human being.” Smiling broadly, Martinez proudly recalled that, at a time when living-wage advocates dreamt of the $8.50 earnings floor, the church in Santa Fe paid none of its sixty-five employees less than $11.50 per hour.

Santa Fe’s move followed that of dozens of other municipalities in the decade since Baltimore kick-started the process in 1994. By the turn of the century, over sixty cities had followed Baltimore’s lead. And, in the years following, dozens more enacted such laws. In some cases, living-wage laws affected only city workers or businesses that contracted with city and state governments; elsewhere, they applied across the board. Yet, despite the movement’s progress, it remained marginal, enforced in scores of cities but not adopted by even one state. California’s statewide minimum wage, the highest in the country, was $8 an hour in 2008, still far short of what living-wage advocates claimed was needed to stabilize the lives of low-income workers. And in much of the country, the federal minimum wage prevailed. It was set at $5.15 an hour in 1997 and stayed at that level for ten years, its real value reduced by almost half, leaving recipients with less purchasing power than minimum-wage earners had had at any point in the previous half century. A new Democratic congressional majority finally passed a three-step minimum-wage increase in 2007; yet the increase envisaged only a $7.25 minimum wage by 2009, and it wasn’t indexed to inflation. Consequently, the federal minimum wage remained a woefully inadequate method of fighting poverty.

That the minimum wage became so diluted hinted at profound changes within the nation’s political culture. In 1938, Franklin Roosevelt signed the minimum wage into law, calling for a “fair day’s pay for a fair day’s work” and declaring that goods produced in workplaces that did not pay a minimum wage “should be regarded as contraband.” Seventy years on, the minimum wage had lost close to half its real value and was seen as a political punching bag, attacked by conservative critics as impeding the workings of the free market.

By the early twenty-first century, reformers questing after Roosevelt’s vision had come to accept that any minimum wage passed at the federal level was likely to be inadequate to meet the needs of its recipients; instead, they opted to push for local and state living-wage ordinances.

The living-wage movement, however, had only a limited impact. While many states enacted a higher minimum wage than that mandated by the federal government in the years after 1997, none implemented one that genuinely met living-wage criteria. As a result, low-end wages continued to stagnate, a process exacerbated by the systemic underestimation of inflation, which allowed employers to minimize the pay raises they gave to employees. Thus, in a period of unprecedented corporate profits and rising worker productivity— up 2.5 percent per year during the 2000s— most working Americans experienced stagnant or falling real income during the Bush presidency. Census Bureau numbers showed that the median household income for working-age households fell, in 2007 dollars, by $2,010 in the years from 2000 to 2007, the only economic cycle on record in which real income for American workers fell. For racial minorities, the trend was even worse: median income for blacks declined by over 5 percent during these years; for Hispanics the decline was 3.1 percent.

At the same time, the percentage of Americans, many of them employed, living below the poverty line steadily rose. In the absence of strong wage-protection laws, many employers continued to grievously underpay their employees. Indeed, Bob Pollin came up with a disturbing estimate of the extent of this problem: by the end of the Bush presidency, fully one in three American workers was earning below his living-wage benchmark.

These were the people— described by Princeton University sociologist Katherine Newman as “the missing class”— most impacted by soaring gas and food costs, people who in the best of times spent a higher proportion of their incomes on basic necessities than did any other part of the population. They were deemed by the government too affluent to qualify for food stamps, Medicaid, and the other welfare programs that collectively constituted the country’s frayed safety net. And yet, once oil prices doubled and then doubled again, once the cost of a gallon of milk, a dozen eggs, a pound of rice ballooned, these men, women, and children were the ones left most exposed to destitution. By trying to keep their jobs, low-wage earners and their families were in many ways rendering themselves worse off than those who never had, or couldn’t keep, paid employment and who therefore qualified for the maximum food stamp allotment and various other government subsidies.

Barack Obama campaigned on a promise to raise the minimum wage to $9.50 per hour by 2011; if he makes good on this promise as president and indexes that minimum wage to inflation, America will finally come close to Roosevelt’s dream of a minimum wage that provides genuine economic security. Given the severity of the financial crisis and the subsequent recession, however, it is more than likely that this goal will remain a promise deferred. For now, at least, local living-wage ordinances and laws targeting the wages of public-sector employees and workers for mega-companies like Wal-Mart continue to offer the best hope for creating a safety net for America’s most vulnerable workers.



Sasha Abramsky is the author of Conned: How Millions Went to Prison, Lost the Vote, and Helped Send George W. Bush to the White House (The New Press, 2006).
