More Connecting the Dots

WANT TO CLOSE THE RICH-POOR COLLEGE-COMPLETION GAP?

DON’T START ON THE FIRST DAY OF COLLEGE,
BUT ON THE FIRST DAY OF KINDERGARTEN

In an updated version of an old joke, four Washington policy types are having dinner together.  During the pause before coffee and dessert, God appears and offers to answer one question from each of them.  “When,” asks the first wonk, “will we have racial harmony in our cities?”  “Not in your lifetime,” replies God.  “When will we pay off the national debt?” asks the second wonk.  “Not in your lifetime,” God answers.  “When will there be peace in the Middle East?” inquires the third wonk.  Once again, God replies, “Not in your lifetime.”

For his question, the fourth wonk chooses the one on the cover of Indicators of Higher Education Equity in the United States: 45 Year Trend Report, recently released by the Pell Institute for the Study of Opportunity in Higher Education:  “When will the U.S. close the gap in higher education attainment by family income?”  God thinks for a moment, then replies, “Not in my lifetime.”

Well, maybe it won’t take that long.  But nothing in the Pell Institute report affords grounds for much optimism that the gap will be closed or substantially narrowed anytime soon.  For 1970, for example, Pell Institute figures show a 46 percentage point gap between the college continuation rate for the top quartile of families, arranged by income, and the bottom.  Forty-two years later, the gap was 37 points.  A nine-point gain in 42 years: At that rate, the gap should be closed in about 173 years.

Or take bachelor’s degree attainment.  In 1970, there was a 34 percentage-point gap between the top and bottom quartiles; by 2013, the gap had doubled to 68 points.  But take heart.  The top quartile’s rate is capped at 100 percent, a ceiling it should reach, at its rate of increase over the past 43 years, by about 2040.  The bottom quartile, at its rate of increase since 1970, will catch up in roughly 1,300 years, by 3313—about a century after the conclusion of the time frame projected by Star Trek: Enterprise.
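For readers who want to check the arithmetic, here is a minimal sketch of both straight-line extrapolations.  One caveat: the quartile attainment rates it uses (roughly 40 to 77 percent for the top quartile and 6 to 9 percent for the bottom, between 1970 and 2013) are approximations consistent with the gaps cited above, not figures quoted in this post.

```python
# Back-of-envelope linear extrapolations behind the figures above.
# Assumed inputs: the quartile attainment rates (about 40% -> 77%
# for the top, 6% -> 9% for the bottom, 1970-2013) are rough reads
# of the Pell report, chosen to match the 34- and 68-point gaps.

def years_to_close(gap_start, gap_end, span_years):
    """Years until a gap hits zero if it keeps shrinking linearly."""
    rate = (gap_start - gap_end) / span_years  # points closed per year
    return gap_end / rate

# College continuation: 46-point gap in 1970, 37 points 42 years later.
print(round(years_to_close(46, 37, 42)))  # -> 173

def year_reaching(y0, y1, r0, r1, target):
    """Year at which a linearly growing rate reaches `target` percent."""
    slope = (r1 - r0) / (y1 - y0)  # points gained per year
    return y1 + (target - r1) / slope

# The top quartile hits the 100 percent ceiling around 2040...
print(round(year_reaching(1970, 2013, 40, 77, 100)))  # -> 2040
# ...while the bottom quartile needs some 1,300 more years, landing
# in the early 3300s (the post's "by 3313").
print(round(year_reaching(1970, 2013, 6, 9, 100)))    # -> 3317
```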

If anything, the report’s shallow-to-flat-to-negative trend lines actually understate the dilemma for low-income students.  A forty-plus-year time span is one useful way of measuring progress.  But in real life, students confront a 4-6-year timeline, the time it takes them to go through college.  Any other time frame is all but meaningless to their educational progress.

Of course, there’s no law that says that federal and state governments can’t increase their investment in educating low-income students.

Actually, there are such laws, the laws that set levels of government support for state universities and Pell Grant scholarships.  So, for example, 29 of the 50 states have decreased their total funding of state colleges and universities, according to the New America Foundation; 44 of 50 decreased per-pupil support. And, as Indicators documents, “the percent of average college costs covered by the maximum Pell Grant declined by 40 percentage points – from a high of 67 percent in 1975 to a low of 27 percent in 2012.”

But those laws could be changed.  Which is what Pell Institute director Margaret Cahalan, this report’s co-author, recommends in her concluding essay, “Sixteen Strategies for Widening Equity of Participation in Higher Education in the United States: Reflections from International Comparisons”: “Restor[e] Public Funding at the Federal, State, and Local levels to Earlier Levels Including Restoring Pell Grants to Their Former Buying Power.”

But does anyone think that state legislatures, or a Congress controlled by starve-the-beast conservatives, will restore those billions of dollars of cuts—or make any of Cahalan’s other sensible investments in educating low-income students, most proven in other countries—anytime soon?

That leaves strategies that don’t require such large funding outlays, and Cahalan includes several of those among her “Sixteen Strategies,” things like “Supporting Competency-Mastery Based Learning and Recognition of Prior Learning…,” “Listening to What Students Are Telling Us” and “Taking an Integrated and Holistic Approach to Student Services and Institutional Access Plans.”

They would all help.  But would any or all of them reverse downward trends or accelerate weak upward ones enough to close the gap?  Doesn’t seem likely.  As with other societal needs, the prospects for supplying the public resources needed to narrow the college-attainment gap between rich and poor seem dim, at least for the next several election cycles.

To recognize that reality is not a counsel of despair.  But it does point to the need for an alternate path: the road to closing, or at least narrowing, the college attainment gap may need to start not when students start college, but when they start kindergarten.

In fact, Indicators also recognizes, in a grab-bag section entitled “Other Factors Affecting Equity,” an obstacle to low-income college completion comparable in importance to paying college costs:

Achieving equity in attainment will require eliminating gaps not only in college enrollment, choice, and completion, but also in other critical outcomes, including completion of a rigorous academic curricular program…In order to enroll and succeed in college, all individuals must graduate from high school academically ready for college-level work.

Of course, all individuals are not now graduating from high school college-ready.  In 2013, ACT reports, 45 percent of students from families with incomes under $36,000 were college-ready in reading, compared to 84 percent for families with incomes over $100,000, a gap of 39 percentage points.  The gaps in the other subjects are similar: 42 points in math and 39 in science.  The racial gap is comparable: 28 percent of all white students tested college-ready in all four subject areas; African Americans: 4 percent.

The consequence: While almost twenty percent of all entering freshmen at four-year colleges must take remedial or developmental courses to make up for content they should have been taught in high school (courses for which they pay college tuition but receive no college credit), the figure is about 32 percent for low-income students, 39 percent for African Americans, 20 percent for Hispanic Americans, and just 13.6 percent for white students.  Worse yet, fewer than four in ten students who take remedial courses graduate from college in six years; one in four don’t even complete the remedial courses.

But although lack of financial support for college and K-12 deficiencies bear comparable responsibility for the completion gap, the two factors differ in their amenability to improvement: K-12 reform has been fermenting for more than a quarter-century and is still bubbling.  The nation has been awakened to the need to improve the education that low-income students receive before college, and an active debate is going on.  Teach For America, charter schools, Joel Klein, Michelle Rhee, Race to the Top.  Education reform is channeling FDR’s dictum:  Try something.  If it fails, try something else.  But above all, try something.  No such zeitgeist, no comparable energy, no potential for real improvement obtains in higher education policy.

Congress could make college readiness, rather than just high school graduation, the objective of federal K-12 policy.  Reauthorization of No Child Left Behind, a/k/a the Elementary and Secondary Education Act (ESEA), appears to be on a glide path toward floor debate in the Senate.  So, as it happens, is the Higher Education Act.  In a just world, Senate HELP Committee chair Lamar Alexander (R-Tenn.) and ranking member Patty Murray (D-Wash.) would recognize that K-12 and post-secondary policy are inextricably linked and join the two pieces of legislation together.  That won’t happen.

But even more important than crossing our fingers for Senate action—and more promising than the snowball’s-chance-in-hell of constructive action in the House—is to act on the aspect of the issue that can be affected, by tapping into the existing momentum of K-12 reform.  State by state and city by city, charter schools, standardized testing, the Common Core curriculum and the rest of the education reform agenda will be implemented and refined, and will succeed—or they won’t, and will fall by the wayside to be replaced, as FDR advocated, by something else.

Restoring the buying power of Pell Grants, increasing state support for public universities, and the other sensible dollar-sign-bearing recommendations in the Pell Institute report will stay on the back burner. But increasing college completion rates by increasing college readiness is ready to go.

THE PICTURE OF DON DRAPER

In our final glimpse of Don Draper, in the Mad Men series finale, we see him in a white shirt, clean shaven, with the haircut that InStyle described as “maybe the most defining feature of [Draper actor Jon] Hamm’s character, short on the sides, slightly longer on top.”

Around him in the shot are his fellow participants at an early-‘seventies, Esalen-like New Age retreat, their shirts counter-culture prints, their hair fashionably long.  His erstwhile ad agency colleagues—Roger, Peggy, Pete, Rizzo and the rest—seen in the rest of the finale’s last hour are likewise dressed and coiffed for the ‘seventies.

Not Don.  The world has changed.  His life has changed.  He has stolen a fallen military colleague’s name and identity.  He has been married and divorced twice and serially betrayed both wives.  He has neglected his work and his business partners.

But he looks as he has always looked over the eleven or so years covered by Mad Men’s seven seasons:  his hair no longer or grayer than when we first met him, his gaze no less direct, his face still unlined—a mid-20th-century Dorian Gray, the title character in Oscar Wilde’s novel, whose youth and beauty never fade, while his portrait changes over time to reflect the moral corruption and decay of the life he has lived.

In addition to the fact that their appearance never changes, Don Draper and Dorian Gray also both have secret identities.  Gray’s identity is split between his unchanging physical self and the portrait that reveals his corruption.  Draper is haunted by the fear that his family and colleagues will discover that his name and identity are not his own.  Born Dick Whitman, he stole the name Don Draper from an officer who was killed in action in the Korean War.

Perhaps most damning, both Gray and Draper use their unchanging outward perfection to enable their success, to gain, and then betray, the confidence of intimates and associates.  “Once Dorian discovers his godlike powers,” New Yorker critic Alex Ross has written, “he carries out various heinous acts, including murder.”  And a young colleague, dismissed after he tries unsuccessfully to duplicate one of Draper’s more outrageous approaches, tells him that the only reason the gambit worked was Don’s handsomeness.

The two parallel stories, Dorian Gray’s and Don Draper’s, diverge at the end.  Seeking to reclaim his virtue and conscience, Dorian slashes the picture that reveals him as the degenerate he has become.  Police and servants hear screams of agony.  But when they gain access to the room where the painting was stored, they find, not the ageless, beautiful youth who had entered the room just moments before, but a corpse, “withered and wrinkled and loathsome of visage.”

Happily, Don Draper escapes, or at least postpones, a comparable final reckoning.  After walking away from his ad agency and giving his expensive car away, he seems to flirt with the idea of renouncing, as Dorian did, his malignant and dishonestly acquired identity. Hair askew, wearing a checked shirt, he appears to contemplate suicide.

But he is Don Draper, not Dorian Gray.  He keeps his identity—not just the name but the appearance and the ad man’s persona.  In the final scene of the series, we see him in a crowd of mantra-chanting meditators, eyes closed, with his trademark haircut and white shirt.

Slowly a smile lights his face.  Could it be a smile of peace with the world and with himself?  Hardly.  He has come up with the idea for another ad campaign.  A segue to the iconic Coca-Cola “I’d like to buy the world a Coke” ad, combined with our recollection of a phone call with Peggy back at the ad agency—They’d take you back in a minute.  You could have the Coke account—tells us that Don hasn’t changed.  He’s still an ad man.

Dorian Gray dies a horrible death. Don Draper will never grow old.


WHO’S TO BLAME FOR HIGH-STAKES-TEST CHEATING SCANDALS?

☐ Teachers who change student answers
☐ Administrators who set “no-excuses” goals
☐ “Experts” who think that setting goals can make up for inadequate preparation

(Choose all that apply)

What could the Veterans Affairs scandal and the Atlanta public school test cheating scandal have in common? Quite a bit, actually.

Both involve public employees whose jobs committed them to help vulnerable people, who could not fulfill those commitments, and who tried to cover it up.

In my last Connecting the Dots post, Want to Get to the Root of the VA Scandal? Follow the Money, I suggested that responsibility in the VA debacle belonged not only to the VA functionaries who jiggered appointment records to make it look like veterans were receiving timely attention, but also to those who took the country to war without budgeting adequately “to care for him who shall have borne the battle” (in the words of Lincoln that form the VA’s motto) and to those who thought that management systems, goals and incentives could compensate for actual funding for veterans’ health care.

As a recent and riveting New Yorker article, “Wrong Answer,” recounts, the events that led to the Atlanta testing scandal followed a similar pattern, a pattern of cheating that arose from an effort to meet unrealistic goals, driven by a combination of dedication and the desire to keep jobs and earn bonuses.

The scandal has been heavily covered in the Atlanta Journal-Constitution and national media: Public school teachers and administrators previewed sealed tests and changed wrong answers to correct ones. 109 school system employees implicated, including 38 principals and administrators. And a former superintendent, mortally ill with cancer, and her associates await trial as the masterminds of the enterprise.

The New Yorker story is told not from the point of view of the investigators who uncovered the cheating but from that of the teachers and administrators who faced what they saw as an impossible task: eliciting high performance on standardized tests from woefully under-prepared middle-schoolers, some of whom, according to teacher Damany Lewis, “were still reading by sounding out the letters.” A veteran teacher told Lewis that only twenty per cent of his students would grasp what he was teaching.

The story is well worth reading in its entirety, but a few excerpts convey the way it unfolded as testing-time approached:

Lewis felt that he had pushed [his students] to work harder than they ever had in their lives. “I’m not going to let the state slap them in the face and say they’re failures,” he told me. “I’m going to do everything I can to prevent the why-try spirit.”…

At the end of the testing week, Lewis went back to the testing office with Crystal Draper, a language-arts teacher. For about an hour, they erased wrong answers and bubbled in the right ones…. Many students were on the cusp of passing, and he gave them a little nudge, so that they would pass by one or two points…

A month later, when the scores came back,… a teacher announced, “You did it! You finally made it!” For the first time since the passage of No Child Left Behind, Parks [Middle School] had met its annual goals. “We had heard what everyone was saying: Y’all aren’t good enough,” [a student told New Yorker writer Rachel Aviv]. “Now we could finally go to school with our heads held high.”

[After a year,] the cheating process began to take the form of a routine…[The school’s reading coordinator] paged up to six teachers and told them to report to the room. While their students were at recess, the teachers erased wrong answers and filled in the right ones. Lewis took photographs of the office with his cell phone so that he could make sure he left every object, even the pencils on Kiel’s desk, exactly as he’d found them…. cheating had become a “well-oiled machine”.

The Atlanta scandal raises a far more complicated set of issues than the VA scandal, many of whose problems could be solved with the money to hire the additional doctors and other staff needed to treat sick and injured vets. Unlike VA employees, who gamed the system to keep their jobs and earn bonuses, the Atlanta teachers in “Wrong Answer” did what they did in what they thought were the best interests of their students as well as themselves. Many children, predominantly from low-income families, are far below grade level and need to catch up. Setting goals is a useful way to set expectations and determine whether they’ve been met. Even “stretch” or aspirational goals have their place: mediocre or worse sports teams begin each season and game believing that they can win, even in the face of rosters that suggest that victory is unlikely.

But just as the trouble at the VA started when treating patients was replaced as an objective by meeting—or appearing to meet—time-to-appointment goals, so the trouble in Atlanta started when education was replaced as a goal by meeting test-score targets. “‘Data’ and ‘accountability’ had become almost magic words,” Aviv writes. “If administrators repeated them enough, it seemed they believed that scores should rise, even if there hadn’t been significant enhancements in instruction.”

It’s not an uncommon phenomenon, Aviv was told by John Ewing, the former executive director of the American Mathematical Society. Ewing is perplexed by educators’ “infatuation with data,” Aviv writes, “their faith that it is more authoritative than using their own judgment. He explains the problem in terms of Campbell’s law, a principle that describes the risks of using a single indicator to measure complex social phenomena: the greater the value placed on a quantitative measure, like test scores, the more likely it is that the people using it and the process it measures will be corrupted.”

Like the VA malefactors, the Atlanta teachers who leaked test questions and changed answers (a small fraction of all Atlanta teachers) had choices. They made the wrong ones and will suffer the consequences.

But they weren’t the ones who set unrealistically high expectations (67 percent of Parks students had to score satisfactorily in language arts and 58 percent in math, or the school could be closed) for children who were still sounding out words in seventh grade. And they weren’t the ones who, by labeling their policy “no excuses,” implied that references to the burden of under-education that students brought with them were “excuses” made by lazy or incompetent teachers.

Under-educated students can be brought up to grade level. The KIPP charter schools have had some success doing it: not by establishing unrealistic “no-excuses” goals, but with significantly longer school days, weeks and years and with teachers who give their students their cell-phone numbers for evening and weekend consultation; and not in a year, but over several years.

It is also worth saying, though, that this regimen of benign Stakhanovism takes its toll on those who are charged with carrying it out. Steven Brill, in his generally pro-education-reform book Class Warfare, follows the school year of a driven young KIPP assistant principal, only to report in his closing chapter that she resigned at the end of the year. “I know I can’t do this forever. And if I had a child I couldn’t do it. As it is, it is screwing up my marriage,” she told Brill. “This wasn’t a sustainable life.”

The point is that bringing under-educated students to grade-level and beyond takes years of smart and hard—perhaps unsustainably hard—work. It will not be accomplished by setting goals and blaming teachers if their students fall short. And if—when—administrators nevertheless set such goals and sanction teachers and schools when the goals are not met, those who set the goals should join the teachers in taking the fall.

WANT TO GET TO THE ROOT OF THE VA SCANDAL?

FOLLOW THE MONEY

When—or should it be if?—the inquiry into the Department of Veterans Affairs scheduling scandal moves beyond identifying which VA officials’ heads will roll, investigators will start working their way up the agency’s ladder of responsibility.

They’ll start with the intake staff who handled veterans’ applications for treatment. They’ll move on to admins who entered appointment data they knew were intended to obscure how long veterans were waiting to get an appointment with a doctor. They’ll look at supervisors who reported that veterans who needed attention were being seen in a timely manner. Those who did wrong will be punished, and they’ll deserve it.

And then, if they follow the advice that Deep Throat gave Bob Woodward in the movie of “All the President’s Men,” investigators will follow the money. Because almost all the media coverage and official reports point to lack of resources—a severe shortage of doctors and other medical staff and of the funds needed to hire and retain them—as the root cause of almost all the evil that’s taken place at the VA.

At the head of the money trail, they will find the policy makers and appropriators who took the country into two wars but failed to recognize that 21st century warfare produces many fewer deaths and many more, and more excruciating, injuries, both physical and mental, that require more, and more expensive, treatment.  Or who recognized it but failed to give the VA the human and financial resources that would be needed, in the words of Lincoln that form the VA’s motto, “to care for him who shall have borne the battle.”  At the end of that road they will also find the experts who believe that lack of resources is merely an excuse made by incompetent or lazy employees who can be brought into line by goals and incentives that force people to work harder.

On the surface, it looks as though VA funding has increased substantially over the last decade and more of war. Total VA funding rose from $58 billion in 2002, before the invasion of Iraq, to $140 billion in 2013, a 142 percent increase. But a closer look shows that while VA entitlement funding (for pensions and disability payments) has gone up 153 percent—as you would expect with Vietnam-era vets reaching retirement age—discretionary funding, which pays for medical treatment, has gone up only 129 percent. In 2002, with the U.S. more or less at peace, VA discretionary funding represented 48 percent of the total. In 2013, after a flood of more than 50,000 wounded Iraq and Afghanistan vets, discretionary funding represented 46 percent of the total VA budget—less than when we were at peace.
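A quick back-of-the-envelope check of those shares, taking the rounded figures above at face value, lands close to the quoted growth rates (the small differences come from rounding in the published shares):

```python
# Sanity check of the budget-share arithmetic above, using the post's
# rounded figures: $58B total in 2002 and $140B in 2013, with
# discretionary funding at 48% and 46% of those totals, respectively.
total_2002, total_2013 = 58.0, 140.0  # $ billions
disc_2002 = 0.48 * total_2002         # discretionary (medical care), 2002
disc_2013 = 0.46 * total_2013         # discretionary (medical care), 2013
ent_2002 = total_2002 - disc_2002     # entitlement remainder, 2002
ent_2013 = total_2013 - disc_2013     # entitlement remainder, 2013

def pct_increase(before, after):
    """Percent growth from `before` to `after`, rounded."""
    return round(100 * (after / before - 1))

print(pct_increase(total_2002, total_2013))  # -> 141 (the quoted ~142%)
print(pct_increase(ent_2002, ent_2013))      # -> 151 (the quoted ~153%)
print(pct_increase(disc_2002, disc_2013))    # -> 131 (the quoted ~129%)
```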

Congressional leaders admit that VA health care has been underfunded and blame the VA. “In an environment where everybody is told, ‘Keep the cost down. Don’t tell me anything costs more,’” Senate Budget Committee chair Patty Murray told Yahoo News, administrators “don’t want to be told by their bosses, ‘Don’t tell me you need more money, because we can’t say that.’… [I]f they need more money, they need to be able to tell us, because how else are we going to solve these problems?” (Congress is not always so deferential in the face of agency budget requests. Post Office requests to balance budgets by cutting service are routinely rejected, while Defense Department recommendations that weapons systems be discontinued are disregarded at the behest of defense contractors and the members of Congress who represent them.)

The underfunding diagnosis is confirmed by those who have been inside the VA. “The scheduling scandal is a symptom of a much more serious disease,” Dr. Sam Foote, a whistleblower who spent 23 years at the VA’s Phoenix hospital, wrote in the New York Times, “a mismatch between the VA’s mission and its resources.”

Rob Nabors, the deputy White House chief of staff whom President Obama dispatched to the VA to sort things out, agrees. “With regard to increasing access to care,” he reported to the president in late June, scheduling “is secondary to the need for additional resources to actually schedule—doctors, nurses, and other health professionals; physical space; and appropriately trained administrative support personnel.”

But what about the scheduling deceit and the cover-up? What about the culture of corruption? They’re real, all right, at VA and at other agencies, and in the private sector. But as VA whistleblower Foote says, they’re symptoms, symptoms of under-resourcing. I’ve never worked at the VA, but I have worked at an underfunded federal bureaucracy, and I saw how people react to not having enough people, time and money to do what needs to be done. It comes in stages: commitment, frustration, cynicism, resignation or, for some, workarounds and gaming the system, within the rules or without.

All of which doesn’t make the bad actors at VA innocent victims. They could have responded otherwise. Several whistleblowers, like Dr. Foote, for example, have come forward; more will step up. Others will surface who became discouraged and left for other agencies or the private sector. Still others soldiered on, scheduling and caring for as many patients as they could. Only a relative few of the Veterans Health Administration’s 200,000+ employees, I am confident, will be found to have cooked the books to keep their jobs or to get promotions and bonuses. There will be more than enough tumbrils for them.

But a share of guilt, moral if not legal, should be saved for those who created the situations out of which the misdeeds arose and in which they festered: the leaders who sent 2.5 million members of the armed services to war with the VA we had instead of the VA service members needed and deserved; the administrations, from two presidents and four Secretaries of Veterans Affairs on down, who didn’t request adequate funding; and the members of Congress who believed that holding the line on the federal budget outweighed caring for wounded veterans.

And save a share of culpability for all those who believe that missions, public and private, that are difficult enough to accomplish with adequate resources can be done just as well without that support. For the sake of those they serve, may their delusions never catch up with their constituents and beneficiaries the way the VA’s have caught up with wounded veterans.

THE CRUMBLING OF IRAQ:

AMERICA’S KING CANUTE MOMENT?

President Obama’s critics, more numerous than ever as his term and political influence wind down, see the looming disintegration of the Iraqi government as proof positive that his policy toward that country and others in the region has failed.

To the contrary:  It suggests that he was right.  Would the small residual force that critics think Obama should have left in Iraq have stemmed the tide of ISIS soldiers now sweeping toward Baghdad?  Or would it have required either a massive redeployment to stop ISIS or an unseemly evacuation of the residual forces to avoid defeat?  And the news footage of US armaments left in Iraq to bolster the Maliki administration, now abandoned by Iraqi defenders and turned against that same administration, speaks for itself.

Indeed, Iraq’s proto-democracy appears to be viable only so long as substantial numbers of US and other countries’ troops are there to secure democratic processes and keep warring factions from tearing each other and the country apart.  Once they depart, Iraqis’ sense of their national identity—the sense that they are parts of one country and not three, one Sunni, one Shiite and one Kurdish—is not strong enough to hold together a country whose borders were drawn by British civil servants close to a century ago.

Think of it as America’s King Canute moment.  Canute is remembered as the eleventh century Danish monarch who in a fit of vainglorious futility commanded the waves to stop.  In reality, it appears that Canute knew very well that the waves wouldn’t stop but commanded them to cease in order to demonstrate to his nobles that even a king did not have unlimited power.

Iraq’s disintegration demonstrates once again that even the United States, with its bottomless bank account, its willingness to risk and lose its soldiers’ lives and health, and its international hard and soft power, cannot command peoples divided by centuries-old ethnic and religious chasms to come and stay together, or cobble together a democracy for and with peoples who have no experience of self-rule.

Whatever made us think that we could?

Primarily our experience after World War II.  After forcing Germany and Japan into unconditional surrender, the US and its wartime allies assumed the political and economic governance of the two defeated powers.  Wartime leaders were executed, imprisoned or effectively exiled.  The structure of government was redesigned.  Even the two countries’ formal names were changed.  And from our tutelage, two strong and stable democracies and US allies emerged.

It has been said that success breeds confidence.  Here it bred overconfidence—the belief that we were rich and wise and resolute enough to turn autocracies into democracies and to turn enemies into friends and allies, if only we were willing to deploy enough military might, invest enough lives and money and stay long enough.

We were wrong.  Vietnam, Iraq and Afghanistan are not Germany and Japan—not already-industrialized countries, not nations with histories of self-government.  A decade of exposure to American political values, of US oversight of the forms of democracy, has not made democracy take root.

Now we need to look from one aspect of our post-World War II experience to another.  Instead of trying to replicate our success in Japan and West Germany, rebuilding countries from the ground up, we need to replicate our patience and fortitude toward the countries of Eastern and Central Europe during the 45 years between the end of the war and the collapse of the Soviet empire and the Soviet Union.

In the years after the war, Republicans made a lot of noise about rolling back communism and Soviet domination in Eastern Europe. But everybody knew that the US was not going to confront Soviet military power and precipitate another land war in Europe—especially after President Eisenhower, the former general who knew America’s military capability better than anyone else, declined to intervene in the Hungarian uprising in 1956.

But the fact that the US could not oust and replace communist regimes across Central and Eastern Europe didn’t mean that we couldn’t do anything.  The CIA had a robust presence in the Soviet Union and across the satellites.  American soldiers and American armaments reinforced the countries on the western side of the Iron Curtain.  We spent massively to stay ahead of the Soviets in the nuclear arms race.  We reached out to the other side in a succession of summits.

And we waited, while the fundamental flaws in communist economic policy undermined the economies and public support of Eastern Europe and the Soviet Union, whose citizens could see Western European prosperity just across their western borders.

Historical analogies are never more than approximate.  Our enemies in the Middle East are not states, as they were in Eastern Europe, but non-state forces.  The threats against which we must defend ourselves are not symmetrical, force against force, but asymmetrical.

But we have spent thirteen years testing the hypothesis that the way to fight terrorism is by reconstituting countries so they are less hospitable to terrorists, and it hasn’t worked.  It hasn’t worked partly because, being non-state actors, terrorists can easily relocate.  And it hasn’t worked because despite our willingness to expend lives and money, ours and our adversaries’, we have not been able to remake Afghanistan and Iraq, and what stability has been achieved has not so far survived our departure.

Despite his inability to turn back the waves, King Canute, so Wikipedia says, “maintained his power by uniting Danes and Englishmen under cultural bonds of wealth and custom, rather than by sheer brutality.”  Maybe there’s a lesson there for us.

PAY EQUITY

WHAT DOES IT COST? WHO SHOULD PAY?

Glenn Kessler, the Washington Post’s fact checker, joins other analysts and commentators in finding fault with President Obama’s once again calling attention to the fact that “Today, the average full-time working woman earns just 77 cents for every dollar a man earns…in 2014, that’s an embarrassment. It is wrong.”  The 23-cent gap, Kessler says, doesn’t so much reflect outright Lilly Ledbetter-type sexism or discrimination—paying women less than men for exactly the same work—as it does life and career choices women have made: “The average woman has less work experience than the average man…[M]ore of the weeks worked by women are part-time rather than full-time. Women also tend to leave the work force for periods in order to raise children, seek jobs that may have more flexible hours but lower pay and choose careers that tend to have lower pay.”

How real are these “choices”?  How many women would work part-time or leave the workforce to raise children if good and affordable child-care were readily accessible?  How many would choose lower-paying professions if not for the barriers that still exist to the higher-paid careers?

But there are larger and more important problems with the pay gap, and with the arguments advanced by both pay-equity advocates, like the president, and critics.  The gap is not bad only because of the process that leads to it, i.e. because it may be the result of illegal discrimination.  It’s bad because it means that half the workforce, and 40 percent of the country’s heads of households, earns almost 25 percent less and therefore has substantially less to spend on their own and their children’s food, housing, health care and education, and less to save for retirement, deficits that will burden their children and are likely to be passed on when those children become adults.

In fact, the “choices” that pay-gap defenders point to as lessening or excusing its inequity—like women staying home to care for children and holding lower-paying jobs—are doubly damaging:  They not only contribute to the gap but serve as disincentives to the performance of work that society badly needs to be done.  Parents being able to choose to care for their children is a good thing.  We should enable such choices—as long as they’re real choices—not burden them.  We need to encourage people to choose careers in lower-paying fields historically populated primarily by women, like teaching and social work, not punish them by paying them less and retarding their career progress.

The curse of the pay gap is thus not only the processes that lead to it but the outcomes it produces: families that cannot afford secure middle-class lives and that will pass on that insecurity to the next generations.

The same can be said, by the way, for income inequalities between races and ethnic groups.  They are deleterious not only as the results of discriminatory processes, but as the perpetrators and perpetuators of economic and social deficits that harm not only those directly burdened by them but the rest of us as well.

Of course, even a more generous federal government than we now enjoy could not legislate or fund a mandate that lower-paid jobs traditionally held by women receive compensation equal to that of jobs traditionally held by men.  But just because we can’t do everything doesn’t mean we can’t do anything.

We could, for example, expand the child-care tax credit to include people—mostly but not exclusively women—who forgo paid employment to care for their own children.  We could provide tax or other economic incentives to people who enter jobs, like teaching and social work, that society needs to be done and done well and that are traditionally held by women.  Or we could give women who interrupt careers to care for children guarantees like those we give to those who interrupt careers to serve in the National Guard or the Reserves: that their jobs, salaries and seniority will be waiting for them when they return to work.

Would such measures involve costs?  Yes.  But so does the gender pay gap.  The only question is who should bear them.

PUT A LID ON THE FACILE ANTI-PUTIN RHETORIC

IT GIVES HIM EXACTLY WHAT HE WANTS

When I lived in Austin, Texas, and, as a graduate of the University of Texas Law School, followed the legal environment more closely than I do now, I felt that I could identify a particular kind of criminal defense lawyer: hyper-active and histrionic, a magnet for the media, firing off motions for recusal and change of venue and far-fetched but admissible alternative theories like Wyatt Earp and Doc Holliday at the OK Corral.

It was my perception that this genus of attorney was most often hired by guilty people. Their motivation, I thought, was partly strategic. The swarm of motions and media reports might infect the community from which the jury pool would be drawn, or might exhaust the limited time and resources of underpaid and under-resourced assistant district attorneys.

But there was another, less practical reason, I thought: they knew they were guilty, they knew they were likely to go down, but they wanted to go down with guns blazing and banners flying.

I think of these lawyers when I hear the senatorial roars and watch the senatorial chest beating occasioned by Russia’s invasion and annexation of Crimea. Obama “needs to do something,” says Lindsey Graham. “Unless we push back soon, the worse [sic] is yet to come.” Put missiles in the Czech Republic, says John McCain.

Such displays of rhetorical aggression—and make no mistake about it, these are just displays; even McCain and Graham, not to mention an overwhelming majority of all Americans, don’t want to intervene militarily in Ukraine—could work. Obama might be intimidated. Putin might be intimidated.

But we all know they won’t work—except to make us feel better, like we’ve gone down with rhetorical guns blazing.

And except to cede to Putin one of his primary objectives for annexing Crimea and insinuating that with Crimea down, Moldova, Estonia, or Poland—anyplace in the region with an identifiable population of ethnic Russians—could be next: making Russia not a regional power with nuclear weapons, as President Obama referred to it, but once again the international power the Soviet Union used to be, the peer in power not of a Britain or France but of the US or China.

Putin has made no secret of his belief that the breakup of the Soviet Union, of which he was a creature, was a disaster. And when we look not only at his annexation of Crimea and his threat to other former Soviet republics and satellites but also at his championing of incorrigibles like Iran and Syria, his actions share the result of making Russia an indispensable party to the conduct of international relations, in Russia’s region and beyond. It seems especially revealing that while Russia maintains its relevance to global politics by holding itself out as a potential influencer of pariahs, its usefulness never seems to extend to actually resolving any of these situations. And of course so long as they remain unresolved, so long do those who desire resolution need to maintain relations with Russia—and so long does Russia continue to sit at the grownup table.

If that is a unifying element of Putin’s actions, it is also a unifying element in President Obama’s low-key responses. Instead of joining McCain’s and Graham’s chest beating, instead of hurling thunderbolt denunciations and implying retribution that everybody knows will either not be carried out or won’t work, Obama has treated Russia like, well, a regional power. Instead of boycotting the Winter Olympics over Russian persecution of gays, which would have framed Russia as equal in stature, Obama sent a low-level delegation of former Olympians—one of whom, Billie Jean King, wound up not going. Instead of bombing Russia with sweeping sanctions—which wouldn’t have worked and, like the grain boycott that Jimmy Carter imposed on the Soviet Union in retribution for its 1979 invasion of Afghanistan, might have hurt us as much as them—Obama imposed narrowly-targeted sanctions on a handful of high-ranking officials and oligarchs.

Did they roll back the annexation? No, but neither would anything else. What they did do is to avoid further elevating Putin, to avoid framing him as a peer to whom otherwise undeserved attention must be paid.

Putin wants to restore Russia to its Soviet status quo ante. And in fact Russia bears a striking resemblance to its Soviet-era self—a state of affairs that, we should remember, didn’t perpetuate it in power but doomed it to near-term collapse. It is still a member of the UN Security Council. And militarily powerful, yes—in the sense that it has nuclear weapons and an army capable of taking on such powerhouses as Georgia and Ukraine. It also has large reserves of oil and gas, and neighboring countries that need it.

But as former Secretary of State George Shultz and former Sen. Sam Nunn point out:

…these assets are also potential liabilities. The Russian economy depends on these trading and financial arrangements and on income from oil and gas sales that are now taking place at historically high prices. Moreover, Russia has a demographic catastrophe looming in its low fertility and astonishingly low longevity rates for men, including men of working age. Many young Russians are emigrating. There is an open rebellion in the Caucasus. Russia shares a long border with China, with hardly anyone and large resources on one side and a lot of people on the other. Putin also has a restive population, as shown in an odd way by the arrest of members of the band Pussy Riot who sang songs of dissent on street corners.

Compared to Ukraine, Georgia and Moldova, Russia is strong. It may yet aggrandize itself further at their expense. But in his pursuit of the status Russia had in the Soviet era, a return to being a super-power to be reckoned with on the world stage, Putin is acting not from strength but from weakness.

If we have at our disposal measures that will roll back or punish Russian invasions, by all means let’s take them. But thundering feckless imprecations, making threats that can’t be carried out or won’t work, bolsters Putin’s ambitions and advances his quest to return Russia to its Soviet-era stature.

WITH FRIENDS LIKE THESE…

HOW NOT TO REFORM THE POST OFFICE

Yes, I know that what used to be called the post office was converted to a quasi-independent corporation and renamed the U.S. Postal Service more than forty years ago.

That’s when things started to go wrong.

I’m not suggesting that by itself, the switch from government agency to quasi-private corporation brought the post office to the pass in which it finds itself: $5 billion in the hole last year, down from $16 billion the year before.  But that change led to a way of thinking about the post office and the service it delivers that hobbles efforts to maintain its viability.

Just talking about a loss points to the heart of the problem.  What was the Army’s operating loss last year?  How much did the State Department lose?  What about the federal highway system?

It’s absurd to ask.  Our national defense, our diplomatic corps, and the road system that ties the country together aren’t businesses selling products or services at prices that cover the cost of producing and providing them and yield a profit.  They are activities so essential that we do not pay for them by assessing their cost to their beneficiaries but rather by apportioning it among us all through our income taxes.  The work these departments do generates some revenue, through the sale of military equipment to other countries or as surplus, for example.  But we wouldn’t dream of downsizing the military or the diplomatic corps or closing a stretch of interstate highways because revenues failed to meet their expenses.

Why, then, do we impose such a burden on the post office?

We do it because, despite the fact that postal rates are set by the government, which makes good its shortfalls, we have come to think about the post office as if it were sort of a private business.  Sen. Tom Carper, the moderate Delaware Democrat who chairs the Senate committee that oversees the post office, and the sponsor of one of the less destructive post office reform bills, brags on his web site that his plan will allow the post office to “operate more like a business.”

What does that mean?  How do businesses operate?

They all have one thing in common: They provide goods and services to people who pay for them.  You stop paying what they charge, and they stop providing whatever they are selling.

That option, at least, appears not to be on the post office reform table: simply ceasing to deliver mail unless it pays the freight.

Instead, would-be reformers propose a smorgasbord of measures like ceasing Saturday delivery, farming out post office functions to office-supply companies with lower-paid staffs, or lifting the prohibition on shipping liquor.  Anything other than directly addressing the sweeping changes that have devastated the post office’s—I bet you thought I was going to say “business model”—service model.

The post office’s fundamental problem is not that it delivers on Saturday, that it can’t deliver booze, or that it pays its employees a decent salary and fringe benefits.  Its problem is that it was founded as a system to deliver paper communications in a few weeks, then a few days, and that it has survived into an era when written communications can be delivered in a few seconds.  None of the proposed reforms address that reality in any meaningful way.

If the post office were really a business, we’d let it go the way of other businesses—like Blockbuster or Packard—whose models have been disrupted by technological developments, social or demographic change or mismanagement.

But reformers to the contrary notwithstanding, the post office isn’t a business and we won’t let it go the way of Blockbuster or Packard.  And not because of powerful postal worker unions either.

We won’t let it go because we still need an institution that does for the country what the post office did in the 18th century, when it was important enough to be the only enterprise mandated in both the Articles of Confederation and the Constitution: tie a diverse and far-flung country together without raising economic barriers that exclude those who need the service the most; something, to borrow Robert Frost’s phrase, you somehow haven’t to afford.

But saving the post office so it can do what it was created to do does not mean letting it continue on its present course.  In fact it very much means not letting it do that.  The Pony Express drastically reduced the time it took to send a letter from the east coast to the west.  But when the telegraph came into its own, we didn’t keep horses and riders galloping between St. Joseph, Missouri, and Sacramento.

To do what it’s supposed to do, the post office needs not just to cut costs, not just to reduce service but to reimagine itself in a way that brings it into the 21st century.

What might that look like?

To start with, it means identifying its most vital service and making sure that that service is supported.  E-mail, text-messaging and other electronic communications have largely supplanted personal communications—largely, but not entirely.  Fifty-nine percent of all mail is advertising, 22 percent is bills and other financial communications, and 10 percent is “other.”  Nine percent is correspondence.  And while some of that is undoubtedly business-related, the 9 percent may be a good starting point as a measure of the people and businesses for which electronic communications are either inaccessible or unaffordable.

What could the post office do for them?  It could help them gain access to electronic communications, for example, perhaps with rows of easy-to-operate terminals at post offices, optimized for email and minor web-browsing and e-commerce but not for gaming or streaming.  That would not only give them the accessibility they now lack but start to develop their e-communication skills.

In the meantime, the post office has to maintain low-cost and tax-supported snail-mail service for first class correspondence until it is no longer needed.  And make sure that bills pay their own way, as third-class advertising already does.

It wouldn’t be painless: Many—maybe most—of the people who staff post offices and deliver mail might find it difficult or impossible to adjust their skill sets to the new way of serving their most needy and deserving customers.  For that matter, the historical timeline of business success and failure is littered with the corpses of smart and hardworking companies that tried and failed to adapt to changing times.

People who really know about this stuff will doubtless come up with better ideas.  But they need to be directed at making a post office that can serve a 21st century country, not at keeping it on life support until it dies the death of a thousand cuts.

GATES MEMOIR: OBAMA HAD DOUBTS ABOUT AFGHANISTAN STRATEGY

NOW WHAT WERE YOU SAYING ABOUT INTERVENING IN SYRIA?

The big reveal in former Defense Secretary Robert Gates’s new memoir—controversial a week before publication—is that President Obama had doubts about the strategy he was pursuing in Afghanistan and the people who were executing that strategy.  “As I sat [in a March 2011 White House meeting on the war],” Gates writes, “I thought, the president doesn’t trust his commander, can’t stand Karzai, doesn’t believe in his own strategy and doesn’t consider the war to be his.  For him, it’s all about getting out.”

Good for Obama.  John F. Kennedy had doubts about the Bay of Pigs invasion and went ahead with it anyway.   Lyndon Johnson had doubts about Vietnam, and not only stayed in but escalated the war.  Obama had doubts about Afghanistan—and moved toward getting us out.

Some commentators suggest that it is unseemly for Gates to publish while the president he served is still in office.

In fact, it couldn’t come at a better time.  Gates’s recollection of how our expeditions into Iraq and Afghanistan turned out arrives just as the civil war in Syria is spurring calls for Obama to do something—just what he should do is seldom specified, but something—to bring peace and democracy to yet another Middle East country.

We threw everything we had at Iraq: shock and awe, boots on the ground, long term occupation—a much greater commitment than anything being urged on Obama in Syria.

The prospects seemed promising.  Iraq had an educated middle class.  Its oil made it economically self-sustaining. If Saddam Hussein and his army could be gotten out of the way, the field would be open for the creation of a civil society. Under our tutelage, Sunni, Shia and Kurds could learn to live together in peace.  We wouldn’t be there long.  Secretary of Defense Donald Rumsfeld was so confident that everything would quickly fall into place that he made no plans for a prolonged occupation.

He was wrong.  Colin Powell was right—we broke Iraq and we owned its consequences: Eight years of war and occupation.  Over $2 trillion spent—more than thirty times the original estimate of $50-60 billion—1.5 million troops deployed, 4,500 killed, 32,000 injured, some grievously, many permanently.  And that’s just Americans.

Today Iraq is free of Saddam—an unalloyed good.  There have been three rounds of elections—three more than were held in the 80-plus years since the British carved Iraq out of the Ottoman Empire after World War I.

And after all that, after the favorable auguries, the heavy investment, the ouster of Saddam, cities with once familiar names, like Fallujah and Ramadi, are back in the headlines under siege—this time by Al Qaeda, which wasn’t even a factor before the US intervention. Iraq is reported to be on the brink of civil war.

With the lessons learned in Iraq—which we fought with the lessons learned in Vietnam—we’ve been in Afghanistan even longer.  We’ve sent over 1 million troops, spent over $640 billion, absorbed more than 2,000 deaths and almost 18,000 injured.  And judging from the alarm that greeted President Obama’s threat to withdraw completely if Afghanistan president Hamid Karzai didn’t agree to terms by the end of 2013, nobody on either side is confident that, without a strong US presence, Afghanistan can resist becoming a haven for terrorists—the whole reason for our going in and staying—or even cohere as a government.

Iraq and Afghanistan are better and freer than they were before we intervened—for now, at least.  But we’ve learned that our military might, our economic investment, our nation-building know-how, and our readiness to sacrifice the lives and health of our armed forces are not sufficient to turn Iraq into a democracy or Afghanistan into a nation.

And we should have learned that while we should do what we can to be helpful to the constructive—or at least the less-destructive—forces in Syria, and while we should do what we can to succor the victims of the civil war, we must be very modest about the ability of any intervention to change reality.

PANIC ATTACK:

THE REAL THREAT TO OBAMACARE

It looks like Obamacare will survive the demands of Tea Party Republicans in the House of Representatives to defund it as part of keeping the government open or raising the debt limit.

But it faces, I believe, a much more serious threat throughout the fall and into the new year, as it takes effect.

Proponents have always assumed that as Obamacare becomes real, as people get protection they’ve never been able to afford, people’s fears will lessen, as they always have with new social programs.  Opposition will melt away or become marginalized.

Well, it could—and, by rights, should—happen that way.

But history suggests another scenario.  It’s not a happy one.  But it’s happened before.

In 1988, Congress passed, and Ronald Reagan signed, a bill that provided prescription drug coverage under Medicare.  It was financed, not by increasing the Medicare payroll tax—or even by adding the cost to the deficit, the way they did for President George W. Bush’s prescription drug benefit—but by a tax on Medicare recipients, with the most prosperous recipients paying the highest rates. The bill passed both houses by overwhelming margins, 328-72 in the House and 86-11 in the Senate.  (Yes, a bill passed with bipartisan support: the past was a different country; they did things differently there.)

But when the law took effect, senior citizens discovered (although it had been there all along, for anybody who had been paying attention) that they—and only they, not everybody, as with the rest of Medicare—would have to pay for the new benefit, a little at first, more later.

They didn’t like it.  And they liked it even less after they were rabble-roused by a coalition led by the National Committee to Preserve Social Security and Medicare, a non-profit headed by Franklin D. Roosevelt’s son, former California congressman James Roosevelt, which specialized in direct-mail exhortations to senior citizens claiming that these programs were in financial jeopardy.  Sound familiar?

The revolt of seniors against the drug benefit—or, more accurately, against paying for it—was epitomized by TV news  reports of Chicago congressman Dan Rostenkowski fleeing from a senior citizens forum.  The Chicago Tribune’s account is priceless:

Shouting “coward,” “recall” and “impeach,” about 50 people followed [Rostenkowski] after he left a meeting…Eventually, the 6-foot-4-inch Rostenkowski cut through a gas station, broke into a sprint and escaped into his car, which minutes earlier had one of the elderly protesters, Leona Kozien, draped over the hood.

The video’s even better.

Just sixteen months after it was passed, the drug coverage (and other health care provisions that were part of the same bill) was repealed, by margins almost as lopsided as those by which it had passed, 360-66 in the House and 73-26 in the Senate.

Like the drug benefit of 1988, the advent of Obamacare this fall and early next year will offer its share of surprises.  Just as in ’88, some will be baffling.  Consider that after almost three years of saturation coverage and outreach by friends and enemies alike, almost a third of those surveyed for the Kaiser Health Tracking Poll didn’t know if the Affordable Care Act was still on the books, while an additional 13 percent believed that it was no longer in effect, repealed by Congress or overturned by the Supreme Court.

Then there will be, as Donald Rumsfeld might say, the unsurprising surprises.  After Obama’s promise, which will be played in heavy rotation on the Fox News Channel and the internet, that everyone with employer-provided health insurance could keep it, a succession of employers will discontinue their employee coverage, leaving their employees to apply for coverage on an Obamacare exchange.

They and others without employer-sponsored insurance will also discover that, even under Obamacare, you get what you pay for.  You can get a plan for $100 or $200 and change, depending on where you live and how much you make, but it will pay for just 60 percent of your medical costs.  Want better-than-bronze coverage—silver, gold or platinum?  You’ll pay silver, gold or platinum premiums.  It’ll still be a good deal, but it won’t seem like it, especially if you’re used to your employer paying the premiums and their not showing up on your pay stubs.

And the computer glitches!  There will be computer glitches; there always are, especially with large, interactive systems.  They will be growing-pains kinds of computer glitches and they will be remedied.  But they’ll still be maddening.  And if you’ve put off your search till the last minute, just before your old plan expires, the glitches will be scary too.

And hovering over Obamacare’s first weeks and months will be the same forces that have hated it (and him) from the beginning, the groups that in 2010 mau-maued congressmen over Obamacare at town hall meetings and morphed into the Tea Party.  And their outcry will be amplified by the media that abetted them in 2010, doing their best to fan sparks of irritation into flames of anger and then a blaze of rage.  Or even, if they have to, creating sparks by rubbing dry sticks together.

Did I say “will be”?  They’re already doing it.

As Jonathan Chait of New York magazine has written, anti-Obamacare forces have

[W]hipped themselves into a frenzy over the law through a process of self-deception. The conservative-media world is both completely obsessed with Obamacare and creating a news cocoon in which the most important news about the law — the lower-than-expected premiums and sharply falling health-care inflation — doesn’t exist at all, and the fate of the law can instead be tracked through a procession of exaggerated or completely imaginary events all showing its rapid collapse. Conservative news sites churn out new Obamacare collapse stories every single day, creating the impression of the law’s continued and unmistakable destruction.

And all this will take place during the opening weeks of a congressional election year.  Members of Congress will be confronted at town hall meetings with real or Snopes-worthy accounts of Obamacare problems and asked if they still support the law.  Would they vote for it again?  Will they pledge to repeal it if they are (re-)elected?

That will be the moment of peril for the Affordable Care Act.  Will Obamacare’s proponents respond effectively?  Will supporters in Congress, many running for re-election, hold firm?

Or will they stampede, as they did in 1989?  Will Democrats in swing districts and states reach out to the president, pleading for at least a delay in the individual mandate, the way the employer mandate was delayed, just until the election is past?  Will Republicans, as they did in 2010, smell blood in the water, and try to cobble together a coalition of energized Republicans and frightened Democrats, a coalition strong enough to repeal Obamacare, even to override a veto?

Not because Obamacare won’t work—it will.  Not because it costs too much—it doesn’t.  And not because it’s too complicated—it isn’t, at least not for what it does.

But because “a lie gets halfway around the world before the truth has a chance to get its pants on.”  Because the Republicans’ frenzy against Obamacare is not countervailed by a Democratic frenzy for it.  Because, heading into his last two years with his poll ratings dropping, the president will be in no position to offer political protection.

And because members of Congress will remember or be told or shown how in 1989 a health care bill that passed Congress and was signed by a Republican president was, before the election cycle turned, repealed by equally overwhelming margins.  Like Brecht’s Galileo, who recants not under torture but upon merely being shown the instruments of torture, they will look at the tape of Dan Rostenkowski being chased down the street by his constituents, and they will see themselves.

How likely is such a gruesome scenario?

At the end of Dickens’ Christmas Carol, Scrooge begs the Ghost of Christmas Yet To Come for surcease.  “Are these the shadows of the things that will be, or are they shadows of things that may be, only?” he asks.  When the Ghost remains silent, Scrooge grasps the Ghost’s arm and desperately pleads for reprieve—at which point the Ghost dwindles into a post on Scrooge’s bed.

It’s just a bad dream—but one with a warning.

