Article 43


Tuesday, September 30, 2014

New Evidence of How Unemployment Wrecks Families


By Eric Pianin
The Fiscal Times
September 29, 2014

Much has been written on the plight of the unemployed in terms of lost income, diminished self-esteem and depression and stress on families.

Just last week, Rutgers University’s JOHN J. HELDRICH CENTER FOR WORKFORCE DEVELOPMENT published its latest findings on the consequences of long-term unemployment, which provide a dreary picture of a highly troubled and disaffected slice of U.S. society devastated by the Great Recession.

Less is known about the effects of unemployment on family stability and its short-term and long-term consequences for children. While there are numerous socio-economic theories on what contributes to the breakdown or disintegration of the family unit, unemployment ranks as one of the leading causes of family instability.

A recent STUDY by researchers at the Urban Institute tells the story: families become far more unstable - and prone to dissolution - when one parent loses a job. The research by Stephan Lindner and Elizabeth Peters tracks the effects in the first year after a job loss for families with children under the age of ten.

The study distinguishes between five different family arrangements from the perspective of a child:  married parents; unmarried biological parents who live together; mothers living with a partner who is not the biological parent of the child; single mothers and single fathers. 

“Our results suggest that children who experience an unemployment event in their families are also more likely to see a destabilizing change in family arrangements in subsequent months, irrespectively of the initial family arrangement,” the study stated.

The most striking finding in families with married parents was that the risk of DIVORCE more than doubles when a parent loses his or her job, Lindner wrote in his blog.

Unemployment is also extremely bad for children of single mothers who have little education, the researchers found. “These children are at a higher risk of not living with their mother during the year following her job loss compared with children with single mothers who are employed,” Lindner explained. This is particularly true for single mothers who have no high school degree.

The chart below from MetroTrends illustrates the adverse impact of job loss on family stability.


The researchers say that new living arrangements precipitated by the loss of a job can be detrimental to a child’s development.

It’s no surprise that the loss of a job can be devastating to marriages and households in the aftermath of a prolonged recession that has changed many Americans’ views of marriage—for the worse.

A child who has lived through their parents’ divorce has long-lasting scars. If the scars are because of financial insecurity, they can influence that child’s life decisions for years to come. That could be one reason why many younger Americans are delaying marriage until they’re financially stable.

According to a new study released last week by the Pew Research Center, the share of AMERICAN ADULTS WHO HAVE NEVER BEEN MARRIED is at “an historic high,” after years of a declining marriage rate.

In 2012, one in five adults ages 25 and older had never been married, according to the Pew analysis of census data. By contrast, in 1960 only about one in ten people in that age bracket had never been married.

“Adults are marrying later in life, and the shares of adults cohabiting and raising children outside of marriage have increased significantly,” the report stated.

Job security ranks among the highest CONCERNS OF UNMARRIED ADULTS, according to the study. In describing what they were LOOKING FOR IN A SPOUSE, 78 percent of unmarried women and 46 percent of unmarried men said “a steady job.”

Such relationships appear highly vulnerable to the effects of joblessness - including the breakup of families unable to deal with the stress.


Hat Tip: Eduardo Felix

Posted by Elvis on 09/30/14 •
Section Dying America
View (0) comment(s) or add a new one
Printable viewLink to this article

Monday, September 29, 2014

Can’t Find A Qualified US Worker Redux 4


The “Skills Gap” Is a Convenient Myth

By Toni Gilpin
Labor Notes
February 14, 2014

Haven’t seen too many “Help Wanted” signs lately? You haven’t been looking hard enough. At factories across the country, thousands of good jobs are going begging.

If that doesn’t sound quite right to you, take it up with the National Association of Manufacturers. NAM and other industry groups insist at least 600,000 FACTORY POSITIONS REMAIN OPEN.

These vacancies are supposed to be the result of a “skills gap” - a shortage of workers with the right stuff for today’s high-tech factories. The gap looms large in high-level discussions of what ails the American economy, and it drives much public policy.

“America wants a country that builds things,” SAYS CATERPILLAR CEO DOUG OBERHELMAN, industry’s leading skills gap spokesman (and board chair of the NAM), “but we have a problem. We don’t have the people we need.”

Politicians of both parties echo this refrain. “Businesses cannot find workers with the right skills,” SAYS DEMOCRATIC SENATOR DICK DURBIN, and REPUBLICAN SENATOR ROB PORTMAN AGREES: “Let’s close the skills gap and get Americans working again.”

PRESIDENT OBAMA TOO, MAINTAINS that America’s “manufacturers cannot find enough workers with the proper skills.”

Such bipartisan agreement is reflected in budget priorities. RETRAINING TOO IS A TOUCHSTONE FOR THE OBAMA WHITE HOUSE, and since the president took office more than 18 BILLION FEDERAL DOLLARS have gone to job training programs. Republican Governor Scott Walker of Wisconsin recently committed $8.5 million TO TRAINING.

Although unemployment remains high, the political focus has shifted away from creating new jobs. Instead it’s on retooling our education system to align with the skilled positions said to be already out there.

Just one hitch: there’s little evidence a “skills gap” exists.


A HOST OF ACADEMIC STUDIES have DEBUNKED THE NOTION - but you don’t need a Ph.D. to figure it out. You just need to recognize the law of supply and demand.

“It’s hard not to break out laughing,” ONE ECONOMIST NOTED recently. “If there’s a skills shortage, there has to be rises in wages [for skilled workers]. It’s basic economics.”

Yet wages in manufacturing - even for skilled workers - are STAGNANT AT BEST.

Peter Cappelli, professor of management at the Wharton School of Business, hears frequent complaints from MANUFACTURERS CLAIMING THEY CAN’T FIND ENOUGH MACHINISTS. “Yet,” CAPPELLI NOTES, “the pay for those positions has dropped 20 percent in real terms over the past 20 years, while skill requirements for many of those jobs have indeed risen.”

Studies from ILLINOIS and WISCONSIN on welding jobs - where employers often cite shortages of available workers - demonstrate that welders’ wages, as well, have decreased over the past decade, and there are thousands more unemployed welders looking for work than there are projected openings.

When skilled slots do go unfilled, it’s because EMPLOYERS SEEK HIGH-VALUE WORKERS AT DISCOUNT RATES.

“We’ve probably all seen the TV shows where new homebuyers go out to look for a new house,” CAPPELLI SAYS, “and they always are shocked to discover they cannot get what they wanted at the price they want to pay. The real estate agent never concludes the problem is a housing shortage. The buyers have to learn either to pay more or expect less. Is that happening with employers? It does not appear to be.”

When pressed, ONE MANUFACTURING CEO ACKNOWLEDGED that for him, “the skills gap meant an inability to find enough highly qualified applicants, with no union-type experience, willing to start at $10 an hour.”

“THAT’S NOT A SKILLS MISMATCH OR EVEN A LABOR SHORTAGE PROBLEM in any meaningful sense,” Marc Levine, professor of history and economic development at the University of Wisconsin-Milwaukee, makes clear. “That’s an EFFORT to secure cheap and docile labor.”

“National data on wages, hours, the ‘job gap’ (the ratio of job seekers to available openings), and the skills requirements of projected job openings reveal no evidence of a skills mismatch in national labor markets,” LEVINE SAYS.


In fact, the real deficit we face is a jobs gap. There are still many more unemployed Americans, across every sector of our economy, than there are positions to put them in. “Unemployment is high,” one analyst notes, “not because workers lack the right education or skills, but because employers have not seen demand for their goods and services pick up enough to need to significantly ramp up hiring.”

It is not the right workers we are lacking, it is work.

“TRAINING DOESN’T CREATE JOBS,” says Anthony Carnevale, director of Georgetown University’s Center on Education and the Workforce. “Jobs create training. And people get that backwards all the time.”

Economist PAUL KRUGMAN STATES BLUNTLY that “claims of a skills gap provide cover for those powerful forces [that] are ideologically opposed to the whole idea of government action on a sufficient scale to jump-start the economy.”


Cat CEO Oberhelman CASTIGATES THE COUNTRY’S “FAILING” SCHOOLS for not turning out “fully employable products” and FAULTS AMERICANS for not pursuing the rewarding careers he says are available in today’s factories.

STORIES LIKE THIS ONE, from a Wisconsin professor, don’t make it into Oberhelman’s script:

Take my former student, John. He did everything we ask young workers to do, earning two journeyman cards while working and attending Milwaukee Area Technical College full-time.

John left Briggs when it began moving jobs to low-wage states and Mexico. But his new employer, Rockwell, began outsourcing to nonunion, low-wage plants even before it eliminated all hourly workers last year.

So John started over again at Harley-Davidson. But, a year and a half ago, Harley laid John off.

CEOs like Oberhelman create the hype about a skills gap and then use it to duck responsibility for the joblessness they are responsible for.

The blame and the costs are offloaded onto workers, obliged to bankroll their own training, or onto taxpayers, as public schools and community colleges scramble to make their graduates more employable.


It’s hypocritical, to put it mildly, for employers to bemoan the shortage of skilled labor while they lay off workers (including skilled ones) and pay less to those they retain. But their whining deflects attention from record profits and lavish executive compensation.

A recent example comes courtesy of BOEING CEO JIM McNERNEY, WHO HAS SAID the U.S. faces an acute “competitive gap” brought on by “insufficient numbers of capable workers.”

Nonetheless, BOEING recently threatened its highly skilled (and unionized) workforce in Everett, Washington, that the company would move its new 777X plane out of state if workers didn’t take concessions. THEY GAVE IN.

Capable workers were not Boeing’s goal. Cheap and compliant ones are what the company was after. Reflect for a moment on which sort of people you would prefer to build the airplanes you travel in.

So, while the fictional skills gap provides a distraction useful to CEOs and politicians, workers (and taxpayers) should keep focused on what matters most: our EVER-RISING LEVEL OF INCOME INEQUALITY.

That’s the gap that needs minding.

Who Foots the Bill?

While employers bemoan a skills gap, they’re not putting up their own money to close it. Just the opposite. Manufacturers provide far less on-the-job training than they once did.

APPRENTICESHIPS - which oblige employers to assume the lion’s share of training costs - have fallen 40 percent since 2008. The decline of America’s machine-tool industry, for instance, can be attributed to the collapse of the apprenticeship system.

There’s just one time when companies do eagerly foot the bill for job training: when it serves to undermine the position of union labor.

Last summer, anticipating a possible strike, Caterpillar placed 25 of its non-union employees into the welding program at a Milwaukee community college. Protests by the Steelworkers, who represent workers at Cat’s South Milwaukee plant, were brushed off.

Just before contract negotiations began, Cat laid off some 300 Milwaukee workers - including skilled welders.

Cat’s hardball tactics resulted in a six-year agreement with frozen pay and far lower wages for new hires.


Employers have also used state-funded training programs to ensure that workers with the wrong kind of experience - that is, a union background - are kept out of their plants.

In Georgia, taxpayer dollars were used to build a training center for the plant where the Kia Optima is built. Instruction there is provided through the state’s Quick Start program, designed to meet the demand for skilled manufacturing workers.

Jobseekers at the non-union Kia plant are required to go through the center’s pre-employment process, and nearly all of Kia’s more than 3,000 employees were trained in robotics, welding, and electronics.

In the process, though, workers already skilled in exactly those areas - members of the United Auto Workers - were evidently weeded out.

When Kia began production in 2010, not one of its employees came from among the pool of thousands of experienced auto workers, all UAW members, who’d lost their jobs when Georgia’s GM and Ford plants closed a few years earlier.

A group of UAW members sued to obtain records on the state’s involvement in Kia’s hiring practices, but their request was rejected by the Georgia Supreme Court.



Why Employers Are to Blame for the “Skills Gap”

By Rob Garver
The Fiscal Times
August 19, 2014

Complaints about a “skills gap” that make it difficult for employers to fill open positions have become commonplace in discussions about the economy and unemployment levels. Workers, the story goes, simply don’t have the educational background or professional training for the kinds of jobs that exist in today’s knowledge economy.

The argument certainly feels like it makes sense - things have changed an awful lot in the past decade, and it could be that older workers simply don’t have the necessary skills for employment today.

The trouble is that economists have become increasingly skeptical about the skills gap narrative, not least of all because of the absence of real wage inflation. After all, if skilled workers were in high demand but short supply, the laws of economics suggest they would be able to demand, and get, higher wages.

A new PAPER by PETER CAPPELLI, a professor at the Wharton School’s Center for Human Resources, should help solve the puzzle of the skills gap. In a comprehensive survey of the literature on the subject, Cappelli reports little hard evidence to support the theory. He notes that when it comes to workers’ skills, the most pervasive problem in the U.S. right now is that many individuals are working jobs for which they are overqualified.

He suggests that what is really driving the discussion about worker skills is a combination of employers seeking to hold down payroll costs by keeping wages as low as possible - and a longer-term effort to transfer responsibility for training workers from employers themselves to the taxpayer.

“The evidence driving the complaints about skills does not necessarily appear where labor market experts might expect to see it, such as in rising wages,” Cappelli writes. “Instead, it comes directly from employers - typically from surveys - who report difficulties hiring the kind of workers they need. The assertions explaining their reported difficulties center on the idea that the academic achievement of high school [graduates] is inadequate or that there are not enough college graduates in practical fields like computer science and ENGINEERING. The recommendations from these reports include increased immigration and use of foreign workers as well as efforts to shape the majors that college students choose.”

Numerous economists have noted that when employers raise wages, skilled employees suddenly become easier to find - and Cappelli notes that much of the discussion about a skills gap appears to be driven by employers looking to hire workers on the cheap.

More telling, though, is that Cappelli, who is also the author of the book Why Good People Can’t Get Jobs, notes a disinclination among employers to train existing workers; he says they look instead to hire individuals who already possess a specific skill set. In many cases, he finds, the business community is pushing the public sector to provide the sort of training that workers used to receive through apprentice programs, professional development programs and other on-the-job training.

“The view that emerges from these arguments is one where responsibility for developing the skills that employers want is transferred from the employer onto job seekers and schools,” he writes. “Such a transfer of responsibility would be profound in its implications.”

“While increased training programs could reduce businesses’ costs,” Cappelli notes, “the end result is likely to be a less efficient system in which key job-related skills are necessarily left out.”

“Schools, at least as traditionally envisioned, are not suited to organize work experience, the key attribute that employers want,” he writes. “Nor are they necessarily good at teaching work-based skills. Those skills are easiest and cheapest to learn in the workplace through APPRENTICE-LIKE ARRANGEMENTS that one finds not only in skilled trades but also in fields like accounting and medicine.”

“Unlike in the classroom,” he continues, “problems to practice on do not have to be created in the workplace. They exist already, and solving them creates value for others. Observation and practice is also easiest to do where the productive work is being done, and employment creates incentives and motivation that typical classrooms cannot duplicate.”

Cappelli closes with a message for the research community. “The myth of the skills gap,” he says, “only exists because, in the absence of hard data on the issue, advocates of a particular position find it easy to make claims that are simply assertions and claims that even casual acquaintance with real evidence would indicate are false.”



In May of 2013 we had 16,944,480 STEM jobs in America.

2,107,070 of them are Software jobs in America.

If you were to study the LCA applications for temporary workers to be brought in under an H-1B visa, you would quickly realize that 65% of them are for these software jobs in America.

During the years from 2001 to 2013, we issued anywhere from a low of 339,243 to a high of 494,565 of these H-1B visas.

The H-1B visa currently is capped at 65,000 for the regular visa and an additional 20,000 for advanced educated visa holders.

The difference between the visas issued and the 85,000 visa cap is the number of visa renewals each year.

This means that we had a low of 220,508 to a high of 321,467 renewals for software jobs alone if we multiply the visas issued by 65%.

Keep in mind that there were only 2,107,070 of these software jobs in America, and these renewals are for periods of three years, which could easily account for a million jobs or more.
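The arithmetic above can be restated in a few lines of Python. All figures (the two caps, the low and high annual visa counts, and the 65% software share) are taken from the text; the script simply reproduces the calculation:

```python
# Reproduce the H-1B arithmetic described above, using the article's figures.
TOTAL_CAP = 65_000 + 20_000          # regular cap plus advanced-degree cap

SOFTWARE_SHARE = 0.65                # share of LCA applications for software jobs
SOFTWARE_JOBS = 2_107_070            # software jobs in America (May 2013)

for issued in (339_243, 494_565):    # lowest and highest annual totals, 2001-2013
    renewals = issued - TOTAL_CAP              # issuances beyond the cap = renewals
    software = round(issued * SOFTWARE_SHARE)  # the article's "software" figure
    print(f"{issued:,} issued -> {renewals:,} renewals, {software:,} software-related")
```

Multiplying the low and high issuance totals by 65% yields 220,508 and 321,467, the figures quoted above.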

As you can see, Americans are slowly getting forced out of the software portion of the STEM industry.

And the ones I have heard from are saying something along these lines:

Most people, even family, just don’t get it.  This is not like unemployment in the past.  In my early 20s, I tried every sort of job for around 4 years or so.  I could always get another job - maybe not a great one - but I was never out of work for any long stretches.  That’s what’s different now: you can’t get another job and get back control of your own life no matter what you do.  Some employed people may be sympathetic to a degree, but they really don’t see it (and maybe don’t want to see it) for what it really is: the end of working and making a decent living, and the end of financial independence.  And you’re NOT going to get a job at McDonald’s or Target or the local supermarket because the Latinos etc already have it.  You’re NOT going to join the young baristas at Starbucks or the waiters and waitresses at Chili’s because all the college kids and recent graduates who can’t get good jobs anywhere else are already working there.  The low-end, low-pay jobs are all taken and the professional jobs are being given to cheaper foreign workers.

Is high tech’s current treatment of American STEM professionals just a pilot project for some future nation-wide screwing of nearly all American professional men and women?  I don’t know, but should anyone really be surprised if it is?  There’s only one way that most people will wake up: when it happens to them, when they lose their jobs and their incomes and lifestyles.  Then they’ll be shocked and outraged and demand that something be done.  But as long as it’s just 20-25 million of their fellow Americans, including hundreds of thousands of displaced STEM professionals, who cares?  What’s that old saying?  When they came for all the others, I didn’t care because I didn’t know any of them, but by the time they came for me there was nobody left to care.

Many of you will be in denial and say this is not happening.

Those of us that have been through it know for a fact that it is happening.

Many of you may be immigrants, or the sons and daughters of immigrants.

I too am of German descent.

I will always believe we need to be a country where people can immigrate to America and become an American if that is what they want to do.

Just like your family or my family did.

And I will always believe that they should be able to apply for any job that their skills allow them to strive for just as I believe Americans should be allowed to do the same.


Bringing in temporary workers to displace Americans, where Americans are being forced to train their replacements, is not immigration.

It is simply the very same tactic that corporations used to break union picket lines by bringing in scabs.

Which leaves us in a position where we can be silent and slowly but surely be forced out of our high paying jobs by temporary workers brought in for less money and find that there are no similar or better paying jobs to be had.

Or we can band together and begin to educate our fellow citizens via radio and tv ads so that their children, and our children will still have the opportunity to strive for the moon if that is their goal.

My thoughts on how to do this are simple, and I’m willing to step aside and join myself if anybody has a better plan:

1. Annual dues of $20.00

2. The money will be used to air radio and tv ads and to provide subsistence-level work for our unemployed STEM workers until we can find them work again.

By that, I mean that we would hire them at $600 per week and use their skills to do the research and development that is necessary to build our knowledge base and educate our fellow citizens, and to contact our employers, both large and small, to help educate them as to what is happening and why it is bad for the future of America, and, most importantly, work with these employers to get our members back to work so that they can provide for their families.

I am but one man.

My resources are limited, and I can’t get this message in front of 16 million STEM workers, but all of us working together can do exactly that.

It really is that simple.

Are you wondering if you are considered a STEM worker?

You might be surprised that you are, and you can verify it by clicking on THIS LINK.

United We Stand

Divided We Fall

I want to get back to work and stay at work as an American in America.

Do you?


Posted by Elvis on 09/29/14 •
Section Dying America
View (0) comment(s) or add a new one
Printable viewLink to this article

Saturday, September 27, 2014

Professors On Food Stamps


The shocking true story of academia in 2014
Forget minimum wage, some adjunct professors say they’re making 50 cents an hour. Wait till you read these stories

By Matt Saccaro
Salon
September 24, 2014

You’ve probably heard the old stereotypes about professors in their ivory tower, lecturing about Kafka while clad in a tweed jacket. But for many professors today, THE REALITY is QUITE DIFFERENT: they are so poorly paid and treated that they’re more likely to be found bargain-hunting at day-old bread stores. This is academia in 2014.

"The most shocking thing is that many of us don’t even earn the federal minimum wage,” said Miranda Merklein, an adjunct professor from Santa Fe who started teaching in 2008. “Our students didn’t know that professors with PhDs aren’t even earning as much as an entry-level fast food worker. We’re not calling for the $15 minimum wage. We don’t even make minimum wage. And we have no benefits and no job security.”

Over three quarters of college professors are adjuncts. Legally, adjunct positions are part-time, at-will employment. Universities pay adjunct professors by the course, anywhere between $1,000 and $5,000. So if a professor teaches three courses in both the fall and spring semesters at a rate of $3,000 per course, they’ll make $18,000. The average full-time barista makes the same yearly wage. However, a full-time adjunct works more than 40 hours a week. They’re not paid for most of those hours.

“If it’s a three-credit course, you’re paid for your time in the classroom only,” said Merklein. “So everything else you do is by donation. If you hold office hours, those you’re doing for free. Your grading you do for free. Anything we do with the student where we sit down and explain what happened when the student was absent, that’s also free labor. Some would call it wage theft because these are things we have to do in order to keep our jobs. We have to do things we’re not getting paid for. It’s not optional.”
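A rough Python sketch of the pay arithmetic above: the per-course range and the six-course load come from the article, while the weekly hours and weeks worked are illustrative assumptions (the text says only “more than 40 hours a week”), so the resulting hourly rates are estimates, not reported figures:

```python
# Per-course pay range and course load come from the article;
# HOURS_PER_WEEK and WEEKS are assumptions for illustration only.
COURSES = 3 * 2                      # three courses in each of two semesters
HOURS_PER_WEEK = 45                  # assumed: "more than 40 hours a week"
WEEKS = 32                           # assumed: two 16-week semesters

for per_course in (1_000, 3_000, 5_000):
    annual = per_course * COURSES
    hourly = annual / (HOURS_PER_WEEK * WEEKS)
    print(f"${per_course:,}/course -> ${annual:,}/year ~ ${hourly:.2f}/hour")
```

At the bottom of the per-course range the effective rate works out to roughly $4 an hour, well under the federal minimum wage; the $3,000 example clears it only because the unpaid hours are capped by assumption.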

Merklein was far from the only professor with this problem.

“It can be a tremendous amount of work,” said Alex Kudera. Kudera started teaching in 1996 and is the author of a novel about adjunct professorship, Fight for Your Long Day. “When I was an adjunct, I didn’t have a social life. It’s basically just work all the time. You plan your weekend around the fact that you’re going to be doing work Saturday and Sunday - typically grading papers, which is emotionally exhausting. The grading can be tedious, but at least it’s a private thing. It’s basically 5-10 hours a day for every day of the week.”

One professor from Indiana who spoke to Salon preferred to remain anonymous. “At some point early in my adjunct career, I broke down my pay hourly. I figured out that I was making under minimum wage and then I stopped thinking about it,” he said. “I can’t speak for everyone, but I essentially design my own courses. And sometimes I don’t find out how many courses I’m going to be teaching until maybe Thursday and they start Monday.  So I have to develop a course, and it’s been the case where one summer I taught English 102 where the course was literally dropped in my lap three days before it started and I had to develop it entirely from scratch. It didn’t even have a text book. That was three 16-hour days in a row developing a syllabus. You’re expected to be in contact with students constantly. You have to be available to them all the time. You’re expected to respond to emails generally within 24 hours. I’m always on-call. And it’s one of my favorite parts of my job, I don’t regret it, but if you factored those on-call hours in, that’d be the end of it. I’d be making 50 cents an hour.”

Being financially secure and teaching at an institute of higher education are almost mutually exclusive, even among professors who are able to teach the maximum amount of courses each semester. Thus, more than half of adjunct professors in the United States seek a second job. Not all professors can find additional employment. An advanced degree slams most doors shut and opens a handful by the narrowest crack.

Nathaniel Oliver taught as an adjunct for four years in Alabama. He received $12,000 a year during his time teaching.

“You fall in this trap where you may be working for less than you would be at a place that pays minimum wage, yet you can’t get the minimum wage jobs because of your education,” Oliver said.

Academia’s tower might be ivory, but it casts an obsidian shadow. Oliver was one of many professors trapped in the oxymoronic life of pedantic destitution. Some professors in his situation became homeless. Oliver was “fortunate” enough to only require food stamps, a fact of life for many adjuncts.

“It’s completely insane,” he said. “And this isn’t happening just to me. More and more people are doing it.”

“We have food stamps,” said the anonymous adjunct from Indiana. “We wouldn’t be able to survive without them.”

“Many professors are on food stamps and they go to food donation centers. They donate plasma. And that’s a pretty regular occurrence,” Merklein told Salon.

Life isn’t much easier for those lucky enough to find another income stream. Many are reduced to menial service jobs and other forms of first-world deprivation.

“I ended up applying for a job in a donut shop recently,” said an Ohio professor who requested to go by a pseudonym. Professor Doe taught for over two decades. Many years he made only $9,600. Resorting to a food service job was the only way he could afford to live, but it came with more than its expected share of humiliation.

“One of the managers there is one of the students I had a year ago, who was one of the very worst writers I’ve ever had. What are we really saying here? What’s going on in the work world? Something does not seem quite right. I’m not asking to be rich. I’m not asking to be famous. I just want to pay my bills.”

Life became even more harrowing for adjuncts after the Affordable Care Act, when universities slashed hours and health insurance coverage became even more difficult to obtain.

“They’re no better off than people who work at Walmart,” said Gordon Haber, a 15-year adjunct professor and author of “Adjunctivitis.”

Perhaps not surprisingly, other professors echoed this sentiment.

“There’s this idea that faculty are cheap, renewable labor. There’s the idea that students are customers or clients,” said Joseph Fruscione, a former adjunct of 15 years. “And there are some cases where, if a student is displeased with a grade, there’s the notion that they’re paying for this, so they deserve an A or a B because of all this tuition.”

“The Walmart metaphor is vivid,” Kudera said. “There are these random schools where they’re just being terrible. But at some of the schools, it seems like there are some enlightened schools, and it doesn’t seem like every single person who speaks up loses their classes. It varies school to school. They’re well aware some of their adjuncts may not afford toothpaste at the end of the month, or whatever those kinds of tragedies may be.” He suggested looking at the hashtag #badmin to see transgressions and complaints documented in real time.

Robert Baum, a former adjunct and now a dean, was able to provide insights from both sides of the problem.

“That pressure [to make money] has been on higher education forever,” he said. “A lot of the time when I was an adjunct, things were very black and white, and what I’m finding is that the graying is happening a lot. I’m losing track of the black and white.” Still, Baum noted that the current system was hardly ideal, and that change was necessary. “The Walmart model is based on the idea of putting the burden of taking care of the worker on either the state or on the worker’s credit card or on the worker’s family. And that is no different than what I’ve experienced across my adjunct life. No different. Zero difference.”

Ana Fores Tamayo, an adjunct who claims she was blacklisted over her activism, agreed with the latter parts of Baum’s assessment.

“Walmart and the compartmentalized way of treating faculty is the going rate. The way administration turns around and says, for instance, where I was teaching it was probably about 65% adjunct faculty. But the way they fix their numbers, it makes it look as if it’s less when they show their books, because the way they divide it and the way they play with their numbers it shows that it’s less.”

“As soon as they hear about you organizing, they go on the defensive,” Merklein said. “For instance, at my community college, I am being intimidated constantly and threatened in various ways, hypothetically usually. They don’t like to say something that’s an outright direct threat. They get really freaked out when they see pamphlets around the adjunct faculty office and everyone’s wearing buttons, regardless of what professional organization or union it is. They will then go on the offensive. They will usually contact their attorney, who is there to protect the school as a business and to act in an anti-labor capacity.”

The most telling phrase in Merklein’s words is “the school as a business.” Colleges across the country have transitioned from bastions of intellectual enlightenment to resort hotels prizing amenities above academics. Case in point: the ludicrously extravagant gyms at America’s larger universities are home to rock climbing walls, corkscrew tracks, rooftop gardens, and a lazy river. Schools have billions to invest in housing and other on-campus projects. Schools have millions (or in some cases mere hundreds of thousands) to pay administrators. Yet schools can’t find the money to hire more full-time professors. If one follows the money, it’s clear that colleges view education as tertiary. The rigor of a university’s courses doesn’t attract the awe of doe-eyed high school seniors. Lavish dorms and other luxuries do.

Despite such execrable circumstances, professors trek onward and try to educate students as best they can. But how good can education provided by overworked, underpaid adjuncts be? The professors Salon spoke to had varying opinions.

Benay Blend has taught for over 30 years. For 10 of those years, she worked in a bookstore for $7.50 an hour because she needed the extra income.

“I don’t want to fall into the trap that the media use that using adjunct labor means poor education,” Blend said. “I have a PhD. I’ve published probably more than full-time people where I teach. I’ve been teaching for 30 years. I’m a good teacher.”

“On the whole, teaching quality by adjuncts is excellent,” said Kane Faucher, a six-year adjunct. “But many are not available for mentoring and consultation because they have to string together so many courses just to reach or possibly exceed the poverty line. This means our resources are stretched too thinly as a matter of financial survival, and there are many adjuncts who do not even have access to a proper office, which means they work out of coffee shops and cars.”

The anonymous adjunct professor from Indiana expressed a similar sentiment.

“I definitely don’t want to go down the road of ‘adjunct professors, because of the way we’re handled, are not able to be effective teachers.’ I think some of us are more effective teachers than people who get paid a lot more than we do. Some of us aren’t, for really good reasons which have to do with not having the resources. I mean, if you’re working at three different colleges, how can you possibly be there?”

Ann Kottner, an adjunct professor and activist, agreed.

“The real problem with the adjunct market right now is that it cheats students of the really outstanding educations they should be getting,” she said. “They’re paying a lot of money for these educations and they’re not getting them. And it’s not because they have bad instructors, it’s because their instructors are not supported to do the kind of work they can do.”

The situation reached such a flashpoint that Kottner and several colleagues (some of whom spoke to Salon for this article) penned a petition to the US Department of Labor’s Wage and Hour Division. The petition calls for “an investigation into the labor practices of our colleges and universities in the employment of contingent faculty.” Ana Fores Tamayo has a petition as well, this one to the US Secretary of Education, Arne Duncan. Both have over 8,000 signatories.

When asked about the petition’s impact, Kottner said it was “just one tactic in the whole sheath of a rising adjunct response to contingency.” Other tools included unionization, which is difficult in many states. Kottner said the most powerful force was information. “I think our biggest weapon now is basically making the public aware of what their tuition dollars are not paying for, and that is professor salaries and professor security.”

When asked if there was any hope about the future, no consensus was reached among the adjuncts Salon spoke with. Some believed things would never change. Others thought the tide would turn if enough people knew how far the professoriat had fallen.


Posted by Elvis on 09/27/14 •
Section Dying America

Monday, September 22, 2014

Chomsky On America

The U.S. Behaves Nothing Like a Democracy, But You’ll Never Hear About It in Our Free Press

By Noam Chomsky
August 15, 2014

In a powerful speech, Chomsky lays out how the majority of US policies are the opposite of what wide swaths of the public want.

I’d like to comment on topics that I think should regularly be on the front pages but are not - and in many crucial cases are scarcely mentioned at all or are presented in ways that seem to me deceptive because they’re framed almost reflexively in terms of doctrines of the powerful.

In these comments I’ll focus primarily on the United States for several reasons: One, it’s the most important country in terms of its power and influence. Second, it’s the most advanced - not in its inherent character, but in the sense that because of its power, other societies tend to move in that direction. The third reason is just that I know it better. But I think what I say generalizes much more widely - at least to my knowledge, obviously there are some variations. So I’ll be concerned then with tendencies in American society and what they portend for the world, given American power.

American power is diminishing, as it has been in fact since its peak in 1945, but it’s still incomparable. And it’s dangerous. Obama’s remarkable global terror campaign and the limited, pathetic reaction to it in the West is one shocking example. And it is a campaign of international terrorism - by far the most extreme in the world. Those who harbor any doubts on that should read the report issued by Stanford University and New York University, and actually I’ll return to even more serious examples than international terrorism.

According to received doctrine, we live in capitalist democracies, which are the best possible system, despite some flaws. There’s been an interesting debate over the years about the relation between capitalism and democracy, for example, are they even compatible? I won’t be pursuing this because I’d like to discuss a different system - what we could call the “really existing capitalist democracy”, RECD for short, pronounced “wrecked” by accident. To begin with, how does RECD compare with democracy? Well that depends on what we mean by “democracy”. There are several versions of this. One, there is a kind of received version. It’s soaring rhetoric of the Obama variety, patriotic speeches, what children are taught in school, and so on. In the U.S. version, it’s government “of, by and for the people”. And it’s quite easy to compare that with RECD.

In the United States, one of the main topics of academic political science is the study of attitudes and policy and their correlation. The study of attitudes is reasonably easy in the United States: heavily-polled society, pretty serious and accurate polls, and policy you can see, and you can compare them. And the results are interesting. In the work that’s essentially the gold standard in the field, it’s concluded that for roughly 70% of the population - the lower 70% on the wealth/income scale - they have no influence on policy whatsoever. They’re effectively disenfranchised. As you move up the wealth/income ladder, you get a little bit more influence on policy. When you get to the top, which is maybe a tenth of one percent, people essentially get what they want, i.e. they determine the policy. So the proper term for that is not democracy; it’s plutocracy.

Inquiries of this kind turn out to be dangerous stuff because they can tell people too much about the nature of the society in which they live. So fortunately, Congress has banned funding for them, so we won’t have to worry about them in the future.

These characteristics of RECD show up all the time. So the major domestic issue in the United States for the public is jobs. Polls show that very clearly. For the very wealthy and the financial institutions, the major issue is the deficit. Well, what about policy? There’s now a sequester in the United States, a sharp cutback in funds. Is that because of jobs or is it because of the deficit? Well, the deficit.

Europe, incidentally, is much worse - so outlandish that even The Wall Street Journal has been appalled by the disappearance of democracy in Europe. ...[I]t had an article [this year] which concluded that “the French, the Spanish, the Irish, the Dutch, Portuguese, Greeks, Slovenians, Slovakians and Cypriots have to varying degrees voted against the currency bloc’s economic model since the crisis began three years ago. Yet economic policies have changed little in response to one electoral defeat after another. The left has replaced the right; the right has ousted the left. Even the center-right trounced Communists (in Cyprus) - but the economic policies have essentially remained the same: governments will continue to cut spending and raise taxes.” It doesn’t matter what people think and “national governments must follow macro-economic directives set by the European Commission”. Elections are close to meaningless, very much as in Third World countries that are ruled by the international financial institutions. That’s what Europe has chosen to become. It doesn’t have to.

Returning to the United States, where the situation is not quite that bad, there’s the same disparity between public opinion and policy on a very wide range of issues. Take for example the issue of minimum wage. The one view is that the minimum wage ought to be indexed to the cost of living and high enough to prevent falling below the poverty line. Eighty percent of the public support that and forty percent of the wealthy. What’s the minimum wage? Going down, way below these levels. It’s the same with laws that facilitate union activity: strongly supported by the public; opposed by the very wealthy - disappearing. The same is true on national healthcare. The U.S., as you may know, has a health system which is an international scandal, it has twice the per capita costs of other OECD countries and relatively poor outcomes. The only privatized, pretty much unregulated system. The public doesn’t like it. They’ve been calling for national healthcare, public options, for years, but the financial institutions think it’s fine, so it stays: stasis. In fact, if the United States had a healthcare system like comparable countries there wouldn’t be any deficit. The famous deficit would be erased, which doesn’t matter that much anyway.

One of the most interesting cases has to do with taxes. For 35 years there have been polls on ‘what do you think taxes ought to be?’ Large majorities have held that the corporations and the wealthy should pay higher taxes. They’ve steadily been going down through this period.

On and on, the policy throughout is almost the opposite of public opinion, which is a typical property of RECD.

In the past, the United States has sometimes, kind of sardonically, been described as a one-party state: the business party with two factions called Democrats and Republicans. That’s no longer true. It’s still a one-party state, the business party. But it only has one faction. The faction is moderate Republicans, who are now called Democrats. There are virtually no moderate Republicans in what’s called the Republican Party and virtually no liberal Democrats in what’s called the Democratic [sic] Party. It’s basically a party of what would be moderate Republicans and similarly, Richard Nixon would be way at the left of the political spectrum today. Eisenhower would be in outer space.

There is still something called the Republican Party, but it long ago abandoned any pretence of being a normal parliamentary party. It’s in lock-step service to the very rich and the corporate sector and has a catechism that everyone has to chant in unison, kind of like the old Communist Party. The distinguished conservative commentator, one of the most respected - Norman Ornstein - describes today’s Republican Party as, in his words, “a radical insurgency - ideologically extreme, scornful of facts and compromise, dismissive of its political opposition” - a serious danger to the society, as he points out.

In short, Really Existing Capitalist Democracy is very remote from the soaring rhetoric about democracy. But there is another version of democracy. Actually it’s the standard doctrine of progressive, contemporary democratic theory. So I’ll give some illustrative quotes from leading figures - incidentally not figures on the right. These are all good Woodrow Wilson-FDR-Kennedy liberals, mainstream ones in fact. So according to this version of democracy, “the public are ignorant and meddlesome outsiders. They have to be put in their place. Decisions must be in the hands of an intelligent minority of responsible men, who have to be protected from the trampling and roar of the bewildered herd”. The herd has a function, as it’s called. They’re supposed to lend their weight every few years, to a choice among the responsible men. But apart from that, their function is to be “spectators, not participants in action” - and it’s for their own good. Because as the founder of liberal political science pointed out, we should not succumb to “democratic dogmatisms about people being the best judges of their own interest”. They’re not. We’re the best judges, so it would be irresponsible to let them make choices just as it would be irresponsible to let a three-year-old run into the street. Attitudes and opinions therefore have to be controlled for the benefit of those you are controlling. It’s necessary to “regiment their minds”. It’s necessary also to discipline the institutions responsible for the “indoctrination of the young.” All quotes, incidentally. And if we can do this, we might be able to get back to the good old days when “Truman had been able to govern the country with the cooperation of a relatively small number of Wall Street lawyers and bankers.” This is all from icons of the liberal establishment, the leading progressive democratic theorists. Some of you may recognize some of the quotes.

The roots of these attitudes go back quite far. They go back to the first stirrings of modern democracy. The first were in England in the 17th Century. As you know, later in the United States. And they persist in fundamental ways. The first democratic revolution was England in the 1640s. There was a civil war between king and parliament. But the gentry, the people who called themselves “the men of best quality”, were appalled by the rising popular forces that were beginning to appear on the public arena. They didn’t want to support either king or parliament. To quote their pamphlets, they didn’t want to be ruled by “knights and gentlemen, who do but oppress us,” but “we want to be governed by countrymen like ourselves, who know the people’s sores”. That’s a pretty terrifying sight. Now the rabble has been a pretty terrifying sight ever since. Actually it was long before. It remained so a century after the British democratic revolution. The founders of the American republic had pretty much the same view about the rabble. So they determined that “power must be in the hands of the wealth of the nation, the more responsible set of men. Those who have sympathy for property owners and their rights”, and of course for slave owners at the time. In general, men who understand that a fundamental task of government is “to protect the minority of the opulent from the majority”. Those are quotes from James Madison, the main framer - this was in the Constitutional Convention, which is much more revealing than the Federalist Papers which people read. The Federalist Papers were basically a propaganda effort to try to get the public to go along with the system. But the debates in the Constitutional Convention are much more revealing. And in fact the constitutional system was created on that basis. 
I don’t have time to go through it, but it basically adhered to the principle which was enunciated simply by John Jay, the president of the Continental Congress, then first Chief Justice of the Supreme Court, and as he put it, “those who own the country ought to govern it”. That’s the primary doctrine of RECD to the present.

There’ve been many popular struggles since - and they’ve won many victories. The masters, however, do not relent. The more freedom is won, the more intense are the efforts to redirect the society to a proper course. And the 20th Century progressive democratic theory that I’ve just sampled is not very different from the RECD that has been achieved, apart from the question of: Which responsible men should rule? Should it be bankers or intellectual elites? Or for that matter should it be the Central Committee in a different version of similar doctrines?

Well, another important feature of RECD is that the public must be kept in the dark about what is happening to them. The “herd” must remain “bewildered”. The reasons were explained lucidly by the professor of the science of government at Harvard - that’s the official name - another respected liberal figure, Samuel Huntington. As he pointed out, “power remains strong when it remains in the dark. Exposed to sunlight, it begins to evaporate”. Bradley Manning is facing a life in prison for failure to comprehend this scientific principle. Now Edward Snowden as well. And it works pretty well. If you take a look at polls, it reveals how well it works. So for example, recent polls pretty consistently reveal that Republicans are preferred to Democrats on most issues and crucially on the issues in which the public opposes the policies of the Republicans and favors the policies of the Democrats. One striking example of this is that majorities say that they favor the Republicans on tax policy, while the same majorities oppose those policies. This runs across the board. This is even true of the far right, the Tea Party types. This goes along with an astonishing level of contempt for government. Favorable opinions about Congress are literally in the single digits. The rest of the government as well. It’s all declining sharply.

Results such as these, which are pretty consistent, illustrate demoralization of the public of a kind that’s unusual, although there are examples - the late Weimar Republic comes to mind. The tasks of ensuring that the rabble keep to their function as bewildered spectators, takes many forms. The simplest form is simply to restrict entry into the political system. Iran just had an election, as you know. And it was rightly criticized on the grounds that even to participate, you had to be vetted by the guardian council of clerics. In the United States, you don’t have to be vetted by clerics, but rather you have to be vetted by concentrations of private capital. Unless you pass their filter, you don’t enter the political system - with very rare exceptions.

There are many mechanisms, too familiar to review, but that’s not safe enough either. There are major institutions that are specifically dedicated to undermining authentic democracy. One of them is called the public relations industry. A huge industry, it was in fact developed on the principle that it’s necessary to regiment the minds of men, much as an army regiments its soldiers - I was actually quoting from one of its leading figures before.

The role of the PR industry in elections is explicitly to undermine the school-child version of democracy. What you learn in school is that democracies are based on informed voters making rational decisions. All you have to do is take a look at an electoral campaign run by the PR industry and see that the purpose is to create uninformed voters who will make irrational decisions. For the PR industry that’s a very easy transition from their primary function. Their primary function is commercial advertising. Commercial advertising is designed to undermine markets. If you took an economics course you learned that markets are based on informed consumers making rational choices. If you turn on the TV set, you see that ads are designed to create irrational, uninformed consumers making irrational choices. The whole purpose is to undermine markets in the technical sense.

They’re well aware of it, incidentally. So for example, after Obama’s election in 2008, a couple of months later the advertising industry had its annual conference. Every year they award a prize for the best marketing campaign of the year. That year they awarded it to Obama. He beat out Apple computer, did an even better job of deluding the public - or his PR agents did. If you want to hear some of it, turn on the television today and listen to the soaring rhetoric at the G-8 Summit in Belfast. It’s standard.

There was interesting commentary on this in the business press, primarily The London Financial Times, which had a long article, interviewing executives about what they thought about the election. And they were quite euphoric about this. They said this gives them a new model for how to delude the public. The Obama model could replace the Reagan model, which worked pretty well for a while.

Turning to the economy, the core of the economy today is financial institutions. They’ve vastly expanded since the 1970s, along with a parallel development - accelerated shift of production abroad. There have also been critical changes in the character of financial institutions.

If you go back to the 1960s, banks were banks. If you had some money, you put it in the bank to lend it to somebody to buy a house or start a business, or whatever. Now that’s a very marginal aspect of financial institutions today. They’re mostly devoted to intricate, exotic manipulations with markets. And they’re huge. In the United States, financial institutions, big banks mostly, had 40% of corporate profit in 2007. That was on the eve of the financial crisis, for which they were largely responsible. After the crisis, a number of professional economists - Nobel laureate Robert Solow, Harvard’s Benjamin Friedman - wrote articles in which they pointed out that economists haven’t done much study of the impact of the financial institutions on the economy. Which is kind of remarkable, considering its scale. But after the crisis they took a look and they both concluded that probably the impact of the financial institutions on the economy is negative. Actually there are some who are much more outspoken than that. The most respected financial correspondent in the English-speaking world is Martin Wolf of the Financial Times. He writes that the “out-of-control financial sector is eating out the modern market economy from the inside, just as the larva of the spider wasp eats out the host in which it has been laid”. By “the market economy” he means the productive economy.

There’s a recent issue of the main business weekly, Bloomberg Business Week, which reported a study by the IMF that found that the largest banks make no profit. What they earn, according to the IMF analysis, traces to the government insurance policy, the so-called too-big-to-fail policy. There is a widely publicized bailout, but that’s the least of it. There’s a whole series of other devices by which the government insurance policy aids the big banks: cheap credit and many other things. And according to the IMF at least, that’s the totality of their profit. The editors of the journal say this is crucial to understanding why the big banks present such a threat to the global economy - and to the people of the country, of course.

After the crash, there was the first serious attention by professional economists to what’s called systemic risk. They knew it existed but it wasn’t much a topic of investigation. ‘Systemic risk’ means the risk that if a transaction fails, the whole system may collapse. That’s what’s called an externality in economic theory. It’s a footnote. And one of the fundamental flaws of market systems, a well-known and inherent flaw, is externalities. Every transaction has impacts on others which just aren’t taken into account in a market transaction. Systemic risk is a big one. And there are much more serious illustrations than that. I’ll come back to it.

What about the productive economy under RECD? Here there’s a mantra too. The mantra is based on entrepreneurial initiative and consumer choice in a free market. There are agreements established called free-trade agreements, which are based on the mantra. That’s all mythology.

The reality is that there is massive state intervention in the productive economy and the free-trade agreements are anything but free-trade agreements. That should be obvious. Just to take one example: The information technology (IT) revolution, which is driving the economy, was based on decades of work in effectively the state sector - hard, costly, creative work substantially in the state sector, no consumer choice at all. There was entrepreneurial initiative, but it was largely limited to getting government grants or bailouts or procurement. Except by some economists, that’s underestimated, but it’s a very significant factor in corporate profit. If you can’t sell something, hand it over to the government. They’ll buy it.

After a long period - decades in fact - of hard, creative work, the primary research and development, the results are handed over to private enterprise for commercialization and profit. That’s Steve Jobs and Bill Gates and so on. It’s not quite that simple of course. But that’s a core part of the picture. The system goes way back to the origins of industrial economies, but it’s dramatically true since WWII that this ought to be the core of the study of the productive economy.

Another central aspect of RECD is concentration of capital. In just the past 20 years in the United States, the share of profits of the two hundred largest enterprises has very sharply risen, probably the impact of the Internet, it seems. These tendencies towards oligopoly also undermine the mantra, of course. Interesting topics but I won’t pursue them any further.

Instead, I’d like to turn to another question. What are the prospects for the future under RECD? There’s an answer. They’re pretty grim. It’s no secret that there are a number of dark shadows that hover over every topic that we discuss and there are two that are particularly ominous, so I’ll keep to those, though there are others. One is environmental catastrophe. The other is nuclear war. Both of which of course threaten the prospects for decent survival and not in the remote future.

I won’t say very much about the first, environmental catastrophe. That should be obvious. Certainly the scale of the danger should be obvious to anyone with eyes open, anyone who is literate, particularly those who read scientific journals. Every issue of a technical journal virtually has more dire warnings than the last one.

There are various reactions to this around the world. There are some who seek to act decisively to prevent possible catastrophe. At the other extreme, major efforts are underway to accelerate the danger. Leading the effort to intensify the likely disaster is the richest and most powerful country in world history, with incomparable advantages and the most prominent example of RECD - the one that others are striving towards.

Leading the efforts to preserve conditions in which our immediate descendants might have a decent life, are the so-called “primitive” societies: First Nations in Canada, Aboriginal societies in Australia, tribal societies and others like them. The countries that have large and influential indigenous populations are well in the lead in the effort to “defend the Earth”. That’s their phrase. The countries that have driven indigenous populations to extinction or extreme marginalization are racing forward enthusiastically towards destruction. This is one of the major features of contemporary history. One of those things that ought to be on front pages. So take Ecuador, which has a large indigenous population. It’s seeking aid from the rich countries to allow it to keep its substantial hydrocarbon reserves underground, which is where they ought to be. Now meanwhile, the U.S. and Canada are enthusiastically seeking to burn every drop of fossil fuel, including the most dangerous kind - Canadian tar sands - and to do so as quickly and fully as possible - without a side glance on what the world might look like after this extravagant commitment to self-destruction. Actually, every issue of the daily papers suffices to illustrate this lunacy. And lunacy is the right word for it. It’s exactly the opposite of what rationality would demand, unless it’s the skewed rationality of RECD.

Well, there have been massive corporate campaigns to implant and safeguard the lunacy. But despite them, there’s still a real problem in American society. The public is still too committed to scientific rationality. One of the many divergences between policy and opinion is that the American public is close to the global norm in concern about the environment and calling for actions to prevent the catastrophe and that’s a pretty high level. Meanwhile, bipartisan policy is dedicated to ‘bringing it on’, in a phrase that George W. Bush made famous in the case of Iraq. Fortunately, the corporate sector is riding to the rescue to deal with this problem. There is a corporate funded organization - the American Legislative Exchange Council (ALEC). It designs legislation for states. No need to comment on what kind of legislation. They’ve got a lot of clout and money behind them. So the programs tend to get instituted. Right now they’re instituting a new program to try to overcome the excessive rationality of the public. It’s a program of instruction for K-12 (kindergarten to 12th grade in schools). Its publicity says that the idea is to improve critical faculties - I’d certainly be in favor of that - by balanced teaching. ‘Balanced teaching’ means that if a sixth grade class learned something about what’s happening to the climate, they have to be presented with material on climate change denial so that they have balanced teaching and can develop their critical faculties. Maybe that’ll help overcome the failure of massive corporate propaganda campaigns to make the population ignorant and irrational enough to safeguard short-term profit for the rich. It’s pointedly the goal and several states have already accepted it.

Well, it’s worth remembering, without pursuing it that these are deep-seated institutional properties of RECD. They’re not easy to uproot. All of this is apart from the institutional necessity to maximize short-term profit while ignoring an externality that’s vastly more serious even than systemic risk. For systemic risk, the market failure - the culprits - can run to the powerful nanny state that they foster with cap in hand and they’ll be bailed out, as we’ve just observed again and will in the future. In the case of destruction of the environment, the conditions for decent existence, there’s no guardian angel around - nobody to run to with cap in hand. For that reason alone, the prospects for decent survival under RECD are quite dim.

Let’s turn to the other shadow: nuclear war. It’s a threat that’s been with us for 70 years. It still is. In some ways it’s growing. One of the reasons for it is that under RECD, the rights and needs of the general population are a minor matter. That extends to security. There is another prevailing mantra, particularly in the academic professions, claiming that governments seek to protect national security. Anyone who has studied international relations theory has heard that. That’s mostly mythology. The governments seek to extend power and domination and to benefit their primary domestic constituencies - in the U.S., primarily the corporate sector. The consequence is that security does not have a high priority. We see that all the time. Right now in fact. Take say Obama’s operation to murder Osama Bin Laden, prime suspect for the 9/11 attack. Obama made an important speech on national security last May 23rd. It was widely covered. There was one crucial paragraph in the speech that was ignored in the coverage. Obama hailed the operation, took pride in it - an operation which incidentally is another step at dismantling the foundations of Anglo-American law, back to the Magna Carta, namely the presumption of innocence. But that’s by now so familiar, it’s not even necessary to talk about it. But there’s more to it. Obama did hail the operation but he added to it that it “cannot be the norm”. The reason is that “the risks were immense”. The Navy SEALs who carried out the murder might have been embroiled in an extended firefight, but even though by luck that didn’t happen, “the cost to our relationship with Pakistan - and the backlash of the Pakistani public over the encroachment on their territory”, the aggression in other words, “was so severe that we’re just now beginning to rebuild this important partnership”.

It’s more than that. Let’s add a couple of details. The SEALs were under orders to fight their way out if they were apprehended. They would not have been left to their fate if they had been, in Obama’s words, “embroiled in an extended firefight”. The full force of the U.S. military would have been employed to extricate them. Pakistan has a powerful military. It’s well-trained and highly protective of state sovereignty. Of course, it has nuclear weapons. And leading Pakistani specialists on nuclear policy and issues are quite concerned about the exposure of the nuclear weapons system to jihadi elements. It could have escalated to a nuclear war. And in fact it came pretty close. While the SEALs were still inside the Bin Laden compound, the Pakistani chief of staff, General Kayani, was informed of the invasion and ordered his staff, in his words, to “confront any unidentified aircraft”. He assumed it was probably coming from India. Meanwhile in Kabul, General David Petraeus, head of the Central Command, ordered “U.S. warplanes to respond if the Pakistanis scrambled their fighter jets”. It was that close. Going back to Obama, “by luck” it didn’t happen. But the risk was faced without noticeable concern - without even reporting, in fact.

There’s a lot more to say about that operation and its immense cost to Pakistan, but instead of that let’s look more closely at the concern for security more generally. Beginning with security from terror, and then turning to the much more important question of security from instant destruction by nuclear weapons.

As I mentioned, Obama’s now conducting the world’s greatest international terrorist campaign - the drones and special forces campaign. It’s also a terror-generating campaign. The common understanding at the highest level [is] that these actions generate potential terrorists. I’ll quote General Stanley McChrystal, Petraeus’ predecessor. He says that “for every innocent person you kill”, and there are plenty of them, “you create ten new enemies”.

Take the marathon bombing in Boston a couple of months ago, that you all read about. You probably didn’t read about the fact that two days after the marathon bombing there was a drone bombing in Yemen. Usually we don’t happen to hear much about drone bombings. They just go on - just straight terror operations which the media aren’t interested in because we don’t care about international terrorism as long as the victims are somebody else. But this one we happened to know about by accident. There was a young man from the village that was attacked who was in the United States and he happened to testify before Congress. He testified about it. He said that for several years, the jihadi elements in Yemen had been trying to turn the village against Americans, get them to hate Americans. But the villagers didn’t accept it because the only thing they knew about the United States was what he told them. And he liked the United States. So he was telling them it was a great place. So the jihadi efforts didn’t work. Then he said one drone attack has turned the entire village into people who hate America and want to destroy it. They killed a man who everybody knew and they could have easily apprehended if they’d wanted. But in our international terror campaigns we don’t worry about that and we don’t worry about security.

One of the striking examples was the invasion of Iraq. U.S. and British intelligence agencies informed their governments that the invasion of Iraq was likely to lead to an increase in terrorism. They didn’t care. In fact, it did. Terrorism increased by a factor of seven the first year after the Iraqi invasion, according to government statistics. Right now the government is defending the massive surveillance operation. That’s on the front pages. The defense is on grounds that we have to do it to apprehend terrorists.

If there were a free press - an authentic free press - the headlines would be ridiculing this claim on the grounds that policy is designed in such a way that it amplifies the terrorist risk. But you can’t find that, which is one of innumerable indications of how far we are from anything that might be called a free press.

Let’s turn to the more serious problem: instant destruction by nuclear weapons. That’s never been a high concern for state authorities. There are many striking examples. Actually, we know a lot about it because the United States is an unusually free and open society and there’s plenty of internal documents that are released. So we can find out about it if we like.

Let’s go back to 1950. In 1950, U.S. security was just overwhelming. There’d never been anything like it in human history. There was one potential danger: ICBMs with hydrogen bomb warheads. They didn’t exist, but they were going to exist sooner or later. The Russians knew that they were way behind in military technology. They offered the U.S. a treaty to ban the development of ICBMs with hydrogen bomb warheads. That would have been a terrific contribution to U.S. security. There is one major history of nuclear weapons policy written by McGeorge Bundy, National Security Advisor for Kennedy and Johnson. In his study he has a couple of casual sentences on this. He said that he was unable to find even a staff paper discussing this. Here’s a possibility to save the country from total disaster and there wasn’t even a paper discussing it. No one cared. Forget it, we’ll go on to the important things.

A couple of years later, in 1952, Stalin made a public offer, which was pretty remarkable, to permit unification of Germany with internationally supervised free elections, in which the Communists would certainly lose, on one condition - that Germany be demilitarized. That’s hardly a minor issue for the Russians. Germany alone had practically destroyed them several times in the century. Germany militarized and part of a hostile Western alliance is a major threat. That was the offer.

The offer was public. It also of course would have led to an end to the official reason for NATO. It was dismissed with ridicule. Couldn’t be true. There were a few people who took it seriously - James Warburg, a respected international commentator, but he was just dismissed with ridicule. Today, scholars are looking back at it, especially with the Russian archives opening up. And they’re discovering that in fact it was apparently serious. But nobody could pay attention to it because it didn’t accord with policy imperatives - vast production of threat of war.

Let’s go on a couple of years to the late ‘50s, when Khrushchev took over. He realized that Russia was way behind economically and that it could not compete with the United States in military technology and hope to carry out economic development, which he was hoping to do. So he offered a sharp mutual cutback in offensive weapons. The Eisenhower administration kind of dismissed it. The Kennedy administration listened. They considered the possibility and they rejected it. Khrushchev went on to introduce a sharp unilateral reduction of offensive weapons. The Kennedy administration observed that and decided to expand offensive military capacity - not just reject it, but expand it. It was already way ahead.

That was one reason why Khrushchev placed missiles in Cuba in 1962, to try to redress the balance slightly. That led to what historian Arthur Schlesinger - Kennedy’s advisor - called “the most dangerous moment in world history”: the Cuban missile crisis. Actually there was another reason for it: the Kennedy administration was carrying out a major terrorist operation against Cuba. Massive terrorism. It’s the kind of terrorism that the West doesn’t care about because somebody else is the victim. So it didn’t get reported, but it was large-scale. Furthermore, the terror operation - it was called Operation Mongoose - had a plan. It was to culminate in an American invasion in October 1962. The Russians and the Cubans may not have known all the details, but it’s likely that they knew this much. That was another reason for placing defensive missiles in Cuba.

Then came very tense weeks as you know. They culminated on October 26th. At that time, B-52s armed with nuclear weapons were ready to attack Moscow. The military instructions permitted crews to launch nuclear war without central control. It was decentralized command. Kennedy himself was leaning towards military action to eliminate the missiles from Cuba. His own subjective estimate of the probability of nuclear war was between a third and a half. That would essentially have wiped out the Northern Hemisphere, according to President Eisenhower.

At that point, on October 26th, the letter came from Khrushchev to Kennedy offering to end the crisis. How? By withdrawal of Russian missiles from Cuba in return for withdrawal of U.S. missiles from Turkey. Kennedy in fact didn’t even know there were missiles in Turkey. But he was informed of that by his advisors. One of the reasons he didn’t know is that they were obsolete and they were being withdrawn anyway. They were being replaced with far more lethal invulnerable Polaris submarines. So that was the offer: the Russians withdraw missiles from Cuba; the U.S. publicly withdraw obsolete missiles that it’s already withdrawing from Turkey, which of course are a much greater threat to Russia than the missiles were in Cuba.

Kennedy refused. That’s probably the most horrendous decision in human history, in my opinion. He was taking a huge risk of destroying the world in order to establish a principle: the principle that we have the right to threaten anyone with destruction any way we like, but it’s a unilateral right. And no one may threaten us, even to try to deter a planned invasion. Much worse than this is the lesson that has been taken away - that Kennedy is praised for his cool courage under pressure. That’s the standard version today.

The threats continued. Ten years later, Henry Kissinger called a nuclear alert. 1973. The purpose was to warn the Russians not to intervene in the Israel-Arab conflict. What had happened was that Russia and the United States had agreed to institute a ceasefire. But Kissinger had privately informed Israel that they didn’t have to pay any attention to it; they could keep going. Kissinger didn’t want the Russians to interfere so he called a nuclear alert.

Going on another ten years, Ronald Reagan is in office. His administration decided to probe Russian defenses by simulating air and naval attacks - air attacks into Russia and naval attacks on its border. Naturally this caused considerable alarm in Russia, which, unlike the United States, is quite vulnerable and had repeatedly been invaded and virtually destroyed. That led to a major war scare in 1983. We have newly released archives that tell us how dangerous it was - much more dangerous than historians had assumed. There’s a recent CIA study that just came out, entitled “The War Scare Was for Real”. It was close to nuclear war. It concludes that U.S. intelligence underestimated the risk of a Russian preventive nuclear strike, launched out of fear that the U.S. was attacking them. The most recent issue of The Journal of Strategic Studies - one of the main journals - writes that this almost became a prelude to a preventative nuclear strike. And it continues. I won’t go through the details, but the Bin Laden assassination is a recent example.

There are now three new threats. I’ll try to be brief, but let me mention three cases that are on the front pages right now: North Korea, Iran, China. They’re worth looking at. North Korea has been issuing wild, dangerous threats. That’s attributed to the lunacy of its leaders. It could be argued that it’s the most dangerous, craziest government in the world, and the worst government. It’s probably true. But if we want to reduce the threats instead of marching blindly in unison, there are a few things to consider. One of them is that the current crisis began with U.S.-South Korean war games, which included for the first time ever a simulation of a preemptive attack in an all-out war scenario against North Korea. Part of these exercises was simulated nuclear bombing on the borders of North Korea. That brings up some memories for the North Korean leadership. For example, they can remember that 60 years ago there was a superpower that virtually leveled the entire country, and when there was nothing left to bomb, the United States turned to bombing dams. Some of you may recall that you could get the death penalty for that at Nuremberg. It’s a war crime. Even if Western intellectuals and the media choose to ignore the documents, the North Korean leadership can read public documents, the official Air Force reports of the time, which are worth reading. I encourage you to read them. They exulted over the glorious sight of massive floods “that scooped clear 27 miles of valley below”, devastated 75% of the controlled water supply for North Korea’s rice production, and sent the commissars scurrying to the press and radio centers to blare to the world the most severe, hate-filled harangues to come from the Communist propaganda mill in the three years of warfare. To the communists, the smashing of the dams meant primarily the destruction of their chief sustenance: rice. Westerners can little conceive the awesome meaning which the loss of this staple food commodity has for Asians: starvation and slow death. Hence the show of rage, the flare of violent tempers and the threats of reprisals when bombs fell on five irrigation dams. Those are mostly quotes.

Like other potential targets, the crazed North Korean leaders can also read high-level documents, public and declassified, which outline U.S. strategic doctrine. One of the most important is a study by Clinton’s strategic command, STRATCOM, about the role of nuclear weapons in the post-Cold War era. Its central conclusions are: the U.S. must retain the right of first strike, even against non-nuclear states; furthermore, nuclear weapons must always be available, at the ready, because they “cast a shadow over any crisis or conflict”. They frighten adversaries. So they’re constantly being used, just as you’re using a gun if you go into a store and point it at the store owner. You don’t fire it, but you’re using it. STRATCOM goes on to say that planners should not be too rational in determining what the opponent values most. All of it has to be targeted. “It hurts to portray ourselves as too fully rational and cool-headed.” That the United States “may become irrational and vindictive if its vital interests are attacked should be part of the national persona that we project”. It’s beneficial for our strategic posture “if some elements appear to be potentially out-of-control”. That’s not Richard Nixon or George W. Bush; it’s Bill Clinton.

Again, Western intellectuals and media choose not to look, but potential targets don’t have that luxury. There’s also a recent history that the North Korean leaders know quite well. I’m not going to review it because of lack of time. But it’s very revealing. I’ll just quote mainstream U.S. scholarship. North Korea has been playing tit for tat - reciprocating whenever Washington cooperates, retaliating whenever Washington reneges. Undoubtedly it’s a horrible place. But the record does suggest directions that would reduce the threat of war if that were the intention, certainly not military maneuvers and simulated nuclear bombing.

Let me turn to the “gravest threat to world peace” - those are Obama’s words, dutifully repeated in the press: Iran’s nuclear program. It raises a couple of questions: Who thinks it’s the gravest threat? What is the threat? How can you deal with it, whatever it is?

‘Who thinks it’s a threat?’ is easy to answer. It’s a Western obsession. The U.S. and its allies say it’s the gravest threat and not the rest of the world, not the non-aligned countries, not the Arab states. The Arab populations don’t like Iran but they don’t regard it as much of a threat. They regard the U.S. as the threat. In Iraq and Egypt, for example, the U.S. is regarded as the major threat they face. It’s not hard to understand why.

What is the threat? We know the answer from the highest level: U.S. intelligence and the Pentagon provide estimates to Congress every year. You can read them. The Global Security Analysis of course reviews this. And they say the main threat of an Iranian nuclear program - whether Iran is developing weapons, they don’t know - is that if it is developing weapons, those weapons would be part of its deterrent strategy. The U.S. can’t accept that. A state that claims the right to use force and violence anywhere and whenever it wants cannot accept a deterrent. So they’re a threat. That’s the threat.

So how do you deal with the threat, whatever it is? Actually, there are ways. I’m short of time so I won’t go through the details, but there’s one very striking one: we just passed up an opportunity last December. There was to be an international conference under the auspices of the Non-Proliferation Treaty - UN auspices - in Helsinki, to deal with moves to establish a nuclear weapons-free zone in the Middle East. That has overwhelming international support - the non-aligned countries; it’s been led by the Arab states, Egypt particularly, for decades. Overwhelming support. If it could be carried forward it would certainly mitigate the threat. It might eliminate it. Everyone was waiting to see whether Iran would agree to attend.

In early November, Iran agreed to attend. A couple of days later, Obama canceled the conference. No conference. The European Parliament passed a resolution calling for it to continue. The Arab states said they were going to proceed anyway, but it can’t be done. So we have to live with the gravest threat to world peace. And we possibly have to march on to war which in fact is being predicted.

The population could do something about it if they knew anything about it. But here, the free press enters. In the United States there has literally not been a single word about this anywhere near the mainstream. You can tell me about Europe.

The last potential confrontation is China. It’s an interesting one, but time is short so I won’t go on.

The last comment I’d like to make goes in a somewhat different direction. I mentioned the Magna Carta. That’s the foundation of modern law. We will soon be commemorating its 800th anniversary. We won’t be celebrating it - more likely interring what little is left of its bones after the flesh has been picked off by Bush and Obama and their colleagues in Europe. And Europe is clearly involved.

But there is another part of the Magna Carta which has been forgotten. It had two components. One is the Charter of Liberties, which is being dismantled. The other was called the Charter of the Forests. It called for protection of the commons from the depredations of authority. This is England, of course. The commons were the traditional source of sustenance - of food and fuel, and of welfare as well. They were nurtured and sustained collectively by traditional societies for centuries. They have been steadily dismantled under the capitalist principle that everything has to be privately owned, which brought with it the perverse doctrine of what is called the tragedy of the commons - a doctrine which holds that collective possessions will be despoiled, so everything has to be privately owned. The merest glance at the world shows that the opposite is true. It’s privatization that is destroying the commons. That’s why the indigenous populations of the world are in the lead in trying to save the Magna Carta from final destruction by its inheritors. And they’re joined by others. Take, say, the demonstrators in Gezi Park trying to block the bulldozers in Taksim Square. They’re trying to save the last part of the commons in Istanbul from the wrecking ball of commercial destruction. This is a kind of microcosm of the general defense of the commons. It’s one part of a global uprising against the violent neo-liberal assault on the population of the world. Europe is suffering severely from it right now. The uprisings have registered some major successes. The most dramatic are in Latin America, which in this millennium has largely freed itself from the lethal grip of Western domination for the first time in 500 years. Other things are happening too. The general picture is pretty grim, I think. But there are shafts of light. As always through history, there are two trajectories. One leads towards oppression and destruction. The other leads towards freedom and justice.
And as always - to adapt Martin Luther King’s famous phrase - there are ways to bend the arc of the moral universe towards justice and freedom - and by now even towards survival.

Noam Chomsky is a professor of linguistics and philosophy at MIT.


Posted by Elvis on 09/22/14 •
Section Revelations

Saturday, September 13, 2014

Rise Of The Temp Workers Part 7

It's Part-Time Work or No Work for Millions of Americans

By Sienna Beard
Wall Street Cheat
September 8, 2014

Part-time employment seems to be a growing trend, with more people working part-time not only in the United States but in other countries as well. In 1968, 13.5 percent of U.S. employees were part-timers; that number reached 20.1 percent in January and remains close to that level now.

Part-time work can be beneficial for employees who want flexibility, want to be at home with kids part-time, or want a second job. But workers are often forced to work part-time because their companies cut hours or they can't find a full-time job. Companies that employ part-time workers see many advantages: often, they don't have to pay the workers as much or give them full (or any) benefits. Although there are some more permanent part-time jobs, many are seasonal or temporary, which makes it easier for employers to part ways with part-time employees when they need or want to.

Although part-time employment has decreased slightly in the U.S. since 2010, the numbers remain high. According to The Washington Post, in June the number of part-time workers rose by more than one million, to 27 million. While we might be out of the recession, many Americans are not finding the jobs they want, with the hours or pay that they need.

Part-time workers face many challenges, including the possibility of no health care benefits. Although this may not be such a big problem for workers who have a full-time job in addition to their part-time job, for workers who solely depend on their part-time job, the lack of benefits can be difficult. In addition, often part-time workers make less than their full-time equivalents.

They also might be more dispensable to their companies: workers hired on a temporary assignment or seasonally, especially, can often be let go easily. Workers who take a temporary job because they need a job then face the prospect of finding another one much sooner than they might wish. Even part-time workers who feel they have a fairly steady job might not have the same responsibilities or say as other workers.

Economists have noticed, and are worried about, the number of part-time workers in the job force. According to USA Today, if the economy is getting better, we should be seeing workers spending more hours at work, but this isn't the case for part-timers. Experts also wonder whether the Affordable Care Act is affecting how many part-time employees companies are hiring, but it's difficult to tell for sure. Hiring part-time workers can be a way to cut costs, regardless of health insurance.

There are many advantages for companies in hiring part-time employees: it can cut costs and allow the company flexibility in scheduling.

However, companies that hire many part-time workers also face potential disadvantages. Part-time workers may be less committed to their job or the company, which can lead to more turnover. They also may know less about their job or the company because they don't work as often and may not stay at the company as long. So although companies can save money by hiring part-time workers, they lose out in other ways.

Full-time work isn't always the best situation, though; part-time work is ideal for certain workers. Those who already have a full-time job but want a little extra income, or want to pursue a passion, can benefit from part-time work. Sometimes parents want to work but also want to be home with their kids, and creating part-time jobs that are flexible but also challenging and rewarding can attract very talented and intelligent workers who do not wish to work full-time. Although many part-time jobs are less desirable because they are designed for workers who have less professional training, this doesn't have to be the case.

Part-time work isn't just a trend in the United States. As of 2011, the average employed person in Austria worked just over twenty-seven hours per week, people in Germany worked an average of just over twenty-five hours per week, and people in Belgium worked nearly twenty-eight hours. The job market in Japan also seems to be favoring more part-time work.

Part-time work may be a trend that's going to develop and continue, and for some people that isn't a bad thing. However, for the many workers in America and across the world who want to work full-time and need the benefits, the steady rise of part-time work is a serious threat.


Hat tip: Eduardo Felix

Posted by Elvis on 09/13/14 •
Section Dying America