Saturday, March 28, 2009

Anti-Careerists in Office
I haven't read the speech yet--I just downloaded it today--but given both my admiration (bordering on man-crush) for the guy and the fact that I come to this question pre-convinced from work I've done recently, I imagine I'll be fully behind Sen. Jim Webb's proposal this week to fundamentally reform the American prison system. Writing in Salon, Glenn Greenwald praises the Senator's courage and leadership in taking on an issue without obvious political upside, and makes a point that probably should be obvious, but isn't:

Our political class has trained so many citizens not only to tolerate, but to endorse, cowardly behavior on the part of their political leaders. When politicians take bad positions, ones that are opposed by large numbers of their supporters, it is not only the politicians, but also huge numbers of their supporters, who step forward to offer excuses and justifications: well, they have to take that position because it's too politically risky not to; they have no choice and it's the smart thing to do. That's the excuse one heard for years as Democrats meekly acquiesced to or actively supported virtually every extremist Bush policy from the attack on Iraq to torture and warrantless eavesdropping; it's the excuse which even progressives offer for why their political leaders won't advocate for marriage equality or defense spending cuts; and it's the same excuse one hears now to justify virtually every Obama "disappointment."

Webb's commitment to this unpopular project demonstrates how false that excuse-making is -- just as it was proven false by Russ Feingold's singular, lonely, October, 2001 vote against the Patriot Act and Feingold's subsequent, early opposition to the then-popular Bush's assault on civil liberties, despite his representing the purple state of Wisconsin. Political leaders have the ability to change public opinion by engaging in leadership and persuasive advocacy. Any cowardly politician can take only those positions that reside safely within the majoritarian consensus. Actual leaders, by definition, confront majoritarian views when they are misguided and seek to change them, and politicians have far more ability to affect and change public opinion than they want the public to believe they have.

The political class wants people to see them as helpless captives to immutable political realities so that they have a permanent, all-purpose excuse for whatever they do, so that they are always able to justify their position by appealing to so-called "political realities." But that excuse is grounded in a fundamentally false view of what political leaders are actually capable of doing in terms of shifting public opinion [...]

Emphasis mine. I obviously agree, but I think Greenwald simplifies the story here in two ways: first, it's much easier to "affect and change public opinion" from the presidential bully pulpit than anywhere else, and indeed our most successful presidents from Jefferson to Reagan have carved out their places in history by doing this. But it's also not usually a matter of speaking truth to power, Mr. Smith Goes to Washington style, but rather moving the battleship by degrees over a period of (typically) eight years, such that by the end of a two-term presidency the middle of public opinion is well to the left or right of where it was at the beginning on questions of executive power, or slavery, or government intervention in the economy. Greenwald's stridency on this point suggests to me that he would have been as critical of the alleged equivocation and incrementalism and moral cowardice of Lincoln in 1861, or FDR in 1933, as he is of Obama today. The national mind never changes in a thunderclap of brilliant oratory; it moves in nudges, through compromises, and almost always with the wind of circumstance (Civil War, Great Depression, stagflation) at its back.

And so it is with Webb in this instance. He's not the president, of course, but he's making a case that would have been much more difficult to make two years ago, before the strain on public budgets made liberalizing corrections policy an economic imperative even for many who don't have a problem with overly punitive incarceration policies. And George W. Bush of all people gave him some political cover last year by signing the Second Chance Act, which signaled a new center of federal thinking around the moral claims of the prison population (around re-entry, in that instance). Not to take anything away from Webb, whose belief in prison reform goes back decades, and who really is taking a political risk in a law-and-order state where he won by the narrowest of margins--but to compare him to Russ Feingold in 2001, as Greenwald does, seems to stretch the point.

Greenwald's thought about why Webb is taking this unusually brave stand is maybe more interesting:

It may be unrealistic to expect most politicians in most circumstances to do what Jim Webb is doing here (or what Russ Feingold did during Bush's first term). My guess is that Webb, having succeeded in numerous other endeavors outside of politics, is not desperate to cling to his political office, and he has thus calculated that he'd rather have six years in the Senate doing things he thinks are meaningful than stay there forever on the condition that he cowardly renounce any actual beliefs. It's probably true that most career politicians, possessed of few other talents or interests, are highly unlikely to think that way.

I suspect this is exactly right, and that Jim Webb's self-image is bound up with his lofty office to a lesser extent than just about all of his colleagues. As a much-lauded soldier, novelist, journalist, and filmmaker at different points of his career, as someone who'd voted in 2000 for the guy he ran against and defeated six years later, he's probably both less institutionalized and less partisan than anyone else up there. It wouldn't shock me if he didn't even run again in 2012; my guess is that Webb came to the Senate thinking he would do three or four things, including an updated GI Bill (done last year) and prison reform and maybe some things on economics, and then retire to write more books. This isn't to say that he doesn't have an enormous ego like the rest of them, but it's based on other things.

Public officials like Webb or Mike Bloomberg or arguably Arnold Schwarzenegger seem to operate from a different starting point. Certainly Bloomberg and Ahnuld are different creatures, careening wildly across the ideological map and gratifying and pissing off just about everybody at one point or another. But they all do seem to have more appetite for big challenges and political risk-taking than their more traditional and career-oriented colleagues. And it's perhaps not a coincidence that none of them has ever lost an election.

Thursday, March 26, 2009

The Mini-Series Writes Itself
Nate Silver makes an observation:

Because of a law that the Congress passed in 2001, the estate tax, which is at 45 percent this year with an exemption up to $3.5 million in assets, will be entirely repealed in 2010 before abruptly returning to its former 55 percent rate in 2011.

Think there might be a few rich grannies pulled off a few respirators on December 31, 2010?

See, I think it's funnier if you take the word "rich" out of there. Because it's not like public understanding of the "death tax" is exactly comprehensive: the truth is that fewer than 2 percent of estates are subject to the tax, which as noted above kicks in only at $3.5 million, but the perception--thanks to a relentless and effective campaign paid for by some of the country's wealthiest families--is that it affects a far larger share of the public. This, of course, significantly goosed support for total repeal of the tax--which was the point. A 2006 paper on the estate tax and public opinion by three researchers at Yale provides some of the details:

People know very little about estate tax levels and rates and rules, as evidenced by a January 2000 Gallup poll, in which most people (53 percent) admitted they simply did not “know enough to say” whether the “federal inheritance tax” was too high, too low, or about right. Obtaining accurate information can be difficult, especially when people have an incentive to mislead you. With little background knowledge, many people seem to guess that nearly everyone is taxed at death—a misperception sometimes encouraged by question wording. For example, in a 2003 National Public Radio / Kaiser Foundation / Harvard Kennedy School (henceforth NKK) survey, two–thirds of respondents either thought “most people have to pay” the estate tax (49 percent) or said they did not know (18 percent); and 62 percent of those opposing the estate tax said one reason was because “it affects too many people.” Controlling for socio–economic and demographic factors, and general attitudes towards the tax code, Slemrod (2003) uses results from this survey to estimate that the misconception that most families pay the estate tax “increases the likelihood of favoring abolition by 10.6 percent.”
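If it helps to see why "most people have to pay" is such a wild misperception, the mechanics are simple enough to sketch in a few lines. Here's a back-of-envelope calculator reflecting the schedule described above, with two caveats: it applies a flat top rate to everything over the exemption (the real schedule was graduated below the top rate), and it assumes the exemption reverts to the old $1 million level along with the 55 percent rate in 2011, a detail the excerpts above don't mention.

```python
# Rough estate tax liability under the three regimes in question.
# Simplification: flat top rate on everything above the exemption.
def estate_tax(estate, year):
    regimes = {
        2009: (0.45, 3_500_000),      # 45% over $3.5M
        2010: (0.00, float("inf")),   # full repeal: nothing is taxable
        2011: (0.55, 1_000_000),      # 55% returns; exemption reverts to ~$1M
    }
    rate, exemption = regimes[year]
    return rate * max(0.0, estate - exemption)

for estate in (1_000_000, 5_000_000, 50_000_000):
    liabilities = {yr: round(estate_tax(estate, yr)) for yr in (2009, 2010, 2011)}
    print(f"${estate:,} estate -> {liabilities}")
```

Run it and the point jumps out: a $1 million estate owes nothing in any year, while a $5 million estate swings from a sizable bill in 2009, to zero in 2010, to an even bigger bill in 2011--which is exactly the perverse incentive Silver is joking about.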

I also remembered, in the course of Googling around on this subject--yes, I'm quite bored this evening--that Larry Bartels released a paper years ago titled "Homer Gets a Tax Cut," which of course is both wonderful and awful. Bartels' findings suggest that as a people, we're not only ignorant but painfully illogical as well:

The results of my analysis suggest that most Americans support tax cuts not because they are indifferent to economic inequality, but because they largely fail to connect inequality and public policy. Three out of every four people say that the difference in incomes between rich people and poor people has increased in the past 20 years, and most of them add that that is a bad thing—but most of these people still support Bush’s tax cuts and the repeal of the estate tax. People who want to spend more money on a variety of government programs are more likely to support tax cuts than those who do not, other things being equal. And people’s opinions about tax cuts are strongly shaped by their attitudes about their own tax burdens but virtually unaffected by their attitudes about the tax burden of the rich—even in the case of the estate tax, which only affects the wealthiest one or two percent of taxpayers. Some of these peculiarities appear to be mitigated by political information, but others seem perversely resilient.

So here's my pitch. It's December 22, 2010, and four lower middle-class couples in Anytown, USA are having a pre-Christmas dinner party. The mood is a bit subdued because three of the four couples have elderly parents in the hospital with various ailments that are chronic but not immediately life-threatening. Their costs for care are rising, with familial assets dwindling. As the evening wears on and everybody starts to get a little drunk, somebody imperfectly remembers a campaign commercial from the midterm elections a couple months before about the insidious Death Tax and how it would ruin family farmers and small businessmen--but it doesn't kick back in until New Year's Day. Another guest observes that his father was a farmer once, so he probably would see the damned gummit take everything if Dad is still around when Dick Clark gets wheeled out. (This naturally assumes that Dick's heirs aren't having the same conversation, even though it's actually relevant for them...) And, goodness, a couple of their parents had owned, or still own, small businesses: they might be failing, but this would be the last nail in the coffin--and as this metaphor is voiced, everybody suddenly goes quiet... and then they burst out laughing. The plot is on!

Saturday, March 21, 2009

To Praise and Bury BSG
Since there will never be a more congenial platform for excessive geekery than one's own blog, I'd actually planned a series of posts on "Battlestar Galactica" this week to celebrate the show's achievements and mark its conclusion. These were to include my twelve favorite BSG episodes ever (twelve Cylon models, twelve colonies...) and my surprisingly (or perhaps embarrassingly) detailed alternative vision for the fourth and final season, which I think the writers screwed up to a great extent. But for whatever reason--sloth, I guess--it didn't happen. Instead I'll just share some thoughts on what made the show intermittently great, and more than occasionally disappointing, and try to articulate where I think it sits in the historical evolution of television and science fiction.

While I don't really intend to get spoiler-y and dissect plot points and raise the many unanswered or unsatisfactorily answered questions left at the show's end, if you haven't seen the finale and plan to do so, it still might be wise to avoid reading all this until afterward.

The frustrating irony of Battlestar Galactica was that over its four seasons the show came to focus primarily on its mythology--the who/what/when/why questions of the Final Five and Starbuck's death and return and Earth--and the personal and emotional journeys of its characters. But in my view, the show really achieved greatness only when its story arcs focused on conflicts within the remnants of humanity, typically setting up two morally compelling positions held by individuals or factions of influence that proved to be irreconcilable. Ultimately what drove the narrative was the theme of actions and consequences: the genocidal attack that began the story, wiping out more than twelve billion humans and leaving the race all but extinct, was prompted, if not merited, by the humans' enslavement of the Cylons decades before. As the line of "personhood" between human and Cylon was blurred and finally erased over the course of the show, the question became whether the remnant of humanity, and/or their pursuers, deserved to survive. The show worked out its answer primarily and most effectively through the often fatal clashes between its human protagonists.

In the first season, these focused on collisions between military and political leadership that revolved around process versus outcomes, means that were questionable or outright deplorable set against ends that were not just desirable but necessary. In the episode "Litmus," then-Commander Adama refused to allow a "witch hunt" through the military court of inquiry, even though that witch hunt might have yielded knowledge about Cylon infiltration and sabotage that would have saved lives... and left the viewer wondering, hoping really, if Adama's bet on retaining the bonds of trust among his military personnel would prove a wise one. Later in the season and into the next season, starting with "Kobol's Last Gleaming, Part 1," simmering disagreement between the Commander and the president of the Colonies--a woman unelected and possibly unstable, and in the grip of religious certitude fueled by knowledge of her own impending death from incurable cancer--split the fleet (and the story) into thirds, and provided the narrative foundation for a series of tightly written, excellently acted episodes.

Thus the template was set: BSG stories that placed protagonists in conflict, gave respect and consideration to both sides, and built upon what had gone before within the show would create a new standard for intelligent science-fiction drama. They did it again, probably reaching the dramatic pinnacle of the series, in the middle of Season Two, when the Battlestar Pegasus showed up under the command of Admiral Cain. Where Galactica and the civilian fleet under its protection were leaving known space, fleeing the Cylons, and hoping to find sanctuary with the possibly mythical "Thirteenth Tribe" on Earth, Cain and Pegasus had ransacked and abandoned surviving civilians and were conducting a series of hit-and-run assaults on the Cylons--hoping somehow to find a way to return to the Twelve Colonies and win a war that Adama had given up as lost at its outset.

Cain's was an absolutist military rule, featuring summary executions and the torture and rape of a Cylon prisoner. In the ethos of the show, she and her crew perhaps didn't "deserve to survive"--but her leadership on Pegasus was unquestioned and unchallenged, and indeed her conduct after the Fall of the Colonies was much closer to military protocol than Adama's had been; Galactica's repeated and flagrant violations of military practice led Cain to integrate the crews of the two ships, ratcheting up tension between the commanders. She'd also enjoyed far more military success against the Cylons pursuing both ships. When a conflict arose between the crews, with Galactica personnel assaulting and inadvertently killing a Pegasus officer, Cain sneeringly dismissed Adama's urging for a military tribunal by observing that he had dissolved his earlier court of inquiry: again, chickens had come home to roost. Ultimately the battlestars came to the brink of a shooting war, and Adama and Cain each plotted to assassinate the other; both stood down at the last possible second. But Cain's past sins ultimately cost her her life, as the escaped Cylon prisoner killed her.

The series reached its peak toward the end of the second season, starting with a stunning episode, "Downloaded," that three-dimensionalized the Cylons for the first time and explored the evolution of this young species as well as some of the same moral questions that the humans had faced. The season concluded with a barn-burner of an election contest between the incumbent Laura Roslin, long since reconciled with the military leadership and staunchly supported by now-Admiral Adama, and her former vice-president Gaius Baltar, known to the viewer but not to the characters as the accidental author of the Cylon genocide. Roslin campaigned as the religious candidate, the leader of destiny whose importance was foretold in the sacred scriptures of the Colonials, absolutely committed to the mystical search for the possibly mythical Earth. Baltar ran as the pragmatic man of science, favoring colonization of a marginally habitable planet discovered in the course of their journey. Baltar won--but only after Roslin, previously presented as absolutely moral, was thwarted in an attempt to steal the vote. It proved to be a disaster as--again--past misdeeds led to present consequences. In this case, it was Baltar's decision to give the escaped Cylon prisoner who had murdered Cain a nuclear weapon; when she set it off, killing thousands, the blast signature alerted a Cylon fleet a light year away, and they eventually showed up to conquer the colony as the undermanned and outgunned Galactica and Pegasus jumped away.

The rescue of the humans at the start of the third season featured the best action episode of the series. But with two exceptions--the stirring trial of Baltar at the end of Season Three, and the shockingly great mutiny arc of the last half-season--this was the end of the show's moral and philosophical dimension. The second half of the series was a fairly conventional sci-fi drama, successful or not based on the quality of the stories and the craft of the presentation. Some intriguing plotlines were hinted at and fitfully pursued, and this is where I would have done things differently--made the endgame much more about Brother Cavil, Dean Stockwell's character who stood as not quite a parent to six of his fellow Cylons and not quite a child to the other five, and brought things to a conclusion on Kobol, the world from which the Colonial civilization sprang, rather than the somewhat cheesily sci-fi "Colony" on which the final battle was fought and the totally deus-ex-machina "Earth 2," a/k/a "our Earth," where the story ended. I would have killed off at least two major characters--Baltar and Roslin--as I think the story would have gained more from their deaths (martyring Baltar early in Season Four could have set up his cult as the Colonials' equivalent to Christianity, with the character Paula as his St. Paul figure) than their lingering. I felt the romantic storylines of the last season added little to the larger plot and often came across as a waste of time--the Ellen/Tigh/Caprica Six Cylon triangle in particular. It's hard to blame the show's creators and producers for lingering on characters whom they'd obviously come to love, but to me it represented their losing sight of what had made the show so frakkin' compelling through its first two seasons.

Annie and I have been watching "The Wire" all winter, and we're now almost done--near the end of Season Four. I think both the quality of this show and the way in which we've watched it--probably an average of four or five episodes a week, as opposed to the real-time manner of one per week then months or years of nothing in which we watched BSG--has colored my view of "Battlestar." Never have I seen any show that gives more credit to the viewer (sometimes probably too much credit) or was less sympathetic to its characters--killing them off or sidelining them for long stretches with something close to glee. The makers of "The Wire" understood something that BSG gurus Ron Moore and David Eick did not: that the show must be more than the sum of its parts. In the end, the obsession of "Battlestar" with those parts rendered the series less than what it once was, and what it ultimately could have been.

Which isn't to say that I'm not deeply grateful for the ride, or that I won't miss it very much.

For more on the end of BSG, including reviews from probably its two biggest fans among critics, see here and here.

Wednesday, March 18, 2009

The Disillusionment
So here we are not quite two months into the Obama presidency, and the comedown is in full effect. Some are wondering why the economy still sucks--why it's getting worse, if anything. Others are baffled and infuriated that the country is still enmeshed in two wars, and hasn't entirely rolled back the Bush/Cheney "War on Terror" policies. The president's approval ratings seem no better than average for this point in a new term, and a meme is taking hold that Obama is trying to do too much, too soon--force-feeding an activist government agenda down the collective maw, with the increasingly likely result that he'll trigger a national gag reflex.

That extremists on the right are shocked and outraged that a liberal president has embraced liberal priorities, and that extremists on the left are incoherent with fury that an essentially conservative president in liberal disguise hasn't immediately and effectively undone all the damage they perceive from the last forty years, is both to be expected and probably a plus for Obama. But like every presidency, this one is in danger of being overtaken by external events--in this case, of course, the economy. The AIG bonus payments might represent less than a drop in the raging river of money we've redirected toward failed financial institutions, but I think that if the administration fails to get them back, Obama's whole agenda is probably compromised beyond salvation. The situation makes him look weak--helpless in fact--and for once, the man's eerie calm is a disadvantage, not a strength. People want him to seem as angry as we all are.

The financial bailout in general, and the AIG bonuses in particular, have such resonance because they reinforce a view that's emerged over the last decade across lines of party, class, race, and culture: the big shots play by a different set of rules. It's the one thing that binds together Bill Clinton's infidelity, the financial scandals of 2002 (Enron et al), the dumb and pointless war the administration took us into and the subsequent alleged war crimes of Cheney and Rumsfeld, and the crash of 2008. Those at the bottom--the grunts at Abu Ghraib, the people who took mortgages they couldn't afford--suffer the consequences of their own mistakes and misdeeds. Those at the top--criminals in suits--not only get away for the most part, but are rewarded with Medals of Freedom or additional millions.

Obama's candidacy and presidency were supposed to represent, among other things, a long-overdue shift back toward meritocracy: every action would carry a consequence, all would get what they deserve, for better and for worse. It's hard to imagine anyone less deserving of rewards than the men who staffed the Financial Products division of AIG. If they get paid for their failures, it will undercut everything this president hopes to accomplish and further disillusion millions of Americans who already are wondering if they were once again sold a bill of goods.

Tuesday, March 10, 2009

Late Night Baseball Musings
I just read Roger Angell's New Yorker review of the recently released book by Sports Illustrated writer Tom Verducci and former New York Yankees manager Joe Torre, The Yankee Years. I know Angell is a legend, and deservedly so--more on that in a minute--and he's 88 years old. Maybe he's aged out of having to submit article drafts to an editor; I'm not sure how else to explain these two sentences.

Torre’s calm and presence aren’t perfect throughout “The Yankee Years” (Doubleday; $26.95), a capacious fresh account of his great run in the Bronx, which he co-wrote with the Sports Illustrated writer Tom Verducci—there’s a nice moment when he tells a Yankee president to shut the fuck up, on the phone—but trust or its poisonous absence are recurrent chords in this narrative of the Steinbrenner empire during the Yankees’ four World Championships between 1996 and 2000, and their ensuing misses or near-misses from 2001 to 2007, when Torre was cut loose in humiliating fashion. Although “The Yankee Years” can be read as urban opera, with scenes taken from the Subway Series against the Mets in 2000 and the emotional resumption of play after 9/11, plus fabulous late (sometimes late, late) post-season duets with the Red Sox, it’s also a case history of the sad physical and mental decline of Emperor George, or an M.B.A. class in radical corporate thinking and its absence in a baseball time of unimaginable financial expansion, or a further take on high-salaried egos and frail character in the steroid era of sports.

That's 185 words right there, with one breath in the middle.

But that aside, it's a wonderful piece that actually made me want to read the book, my general distaste for the Yankees and relative lack of interest in Torre notwithstanding. Or maybe it's just making me want to read Angell's books. Either way, I found this one of the more insightful bursts of prose I've read about my favorite hobby:

Yankee fans love to look back on the good stuff and keep it on permanent replay, but there’s never enough of it, because these losing nights, the killers, keep coming back and take over in our minds. In the book, it’s a rush when you reach those latter-nineties or millennial late-inning Yankee explosions and Stadium-shaking endings, like the successive-night home runs against the same pitcher, Diamondback closer Byung-Hyun Kim, in the fourth and fifth games of the World Series of 2001. Two years along, Aaron Boone eliminates the Red Sox once again in the Championship Series, with his eleventh-inning lead-off homer into the lower left-field stands. Hold it right there—only you can’t. The two biggest games in the book by far are Yankee defeats: the D-backs’ seventh-game World Series effort in 2001, when Arizona rallies with a pair of ninth-inning runs against the untouchable Mariano Rivera to win their first and only championship; and the Red Sox’ tying rally (again against Mo) in the ninth inning of that 2004 A.L.C.S. fourth game—they’ve trailed in this series, remember, three games to none, and face elimination here—and then the twelfth-inning, two-run home run by David Ortiz that wins the game and begins the tectonic shift away from the Bronx and toward Boston.

"Hold it right there--only you can't." That's just perfect. This has been the strangest baseball off-season I can remember; just the changing of the weather and lengthening of days, plus the fact that I'm not going to Florida this year, has me craving the start of the season. But how can it ever be any better than October 2008?

Angell probably would answer (and indeed implies in his characterization of the Yankees' 2006 season, when the team won 97 games, was bounced from the playoffs by the eventual AL champion Tigers, and got to hear George Steinbrenner describe their season as "this sad failure") that it's in the journey, not the destination, where joy is to be found. True enough. And I know, and I've written on The Good Phight and elsewhere, that the Phils were the beneficiaries of good fortune in their run to glory last fall. Maybe what I'm really worried about is that having (vicariously) been to the mountaintop, I won't enjoy the climb as much.

Monday, March 09, 2009

Maybe these two things I'm about to present have no connection, nothing in common. Or maybe, taken together, they're suggesting something about our future that we'd better figure out. I'm honestly not sure myself.

Item One is Richard Florida's much-discussed cover story in the current Atlantic, "How the Crash Will Reshape America." Florida brings his usual thesis to this analysis: cities with vibrant intellectual ecosystems populated by knowledge workers will thrive, or in this case suffer relatively less. New York in particular is cited as a "winner" in the post-recession economy (though how much consolation this will provide the estimated 270,000 New Yorkers--a figure recently revised upward--who will lose their jobs is unknown). The southwest and the suburbs in general are losers in his scenario, bringing Florida somewhat in line with James Howard Kunstler's "Long Emergency" theory, and for somewhat the same reason: shrinking oil supplies and tighter credit markets will render headlong development unsustainable. Additionally, Florida sees the crash auguring the final disappearance of the old manufacturing communities scattered throughout the Rust Belt as well as in the non-unionized south. And though he's considerably less dire and dramatic about it, Florida again echoes Kunstler in suggesting that the crash will bring to a close the developmental patterns of the last sixty-plus years:

[W]e need to be clear that ultimately, we can’t stop the decline of some places, and that we would be foolish to try. Places like Pittsburgh have shown that a city can stay vibrant as it shrinks, by redeveloping its core to attract young professionals and creative types, and by cultivating high-growth services and industries. And in limited ways, we can help faltering cities to manage their decline better, and to sustain better lives for the people who stay in them.

But different eras favor different places, along with the industries and lifestyles those places embody. Band-Aids and bailouts cannot change that. Neither auto-company rescue packages nor policies designed to artificially prop up housing prices will position the country for renewed growth, at least not of the sustainable variety. We need to let demand for the key products and lifestyles of the old order fall, and begin building a new economy, based on a new geography.

What will this geography look like? It will likely be sparser in the Midwest and also, ultimately, in those parts of the Southeast that are dependent on manufacturing. Its suburbs will be thinner and its houses, perhaps, smaller. Some of its southwestern cities will grow less quickly. Its great mega-regions will rise farther upward and extend farther outward. It will feature a lower rate of homeownership, and a more mobile population of renters. In short, it will be a more concentrated geography, one that allows more people to mix more freely and interact more efficiently in a discrete number of dense, innovative mega-regions and creative cities. Serendipitously, it will be a landscape suited to a world in which petroleum is no longer cheap by any measure. But most of all, it will be a landscape that can accommodate and accelerate invention, innovation, and creation—the activities in which the U.S. still holds a big competitive advantage.

One's imagination can run away pretty quickly with this scenario: I start thinking about abandoned suburbs turning into ghost towns or even tourist attractions like the frontier communities of the Old West, or vast swaths of "the heartland" reverting to nature. (Though I guess as long as the land remains arable, we'll grow food on it--just that no other jobs will be there, and hence no people.)

But the changing economic landscape brings me to Item Two: an analysis piece from last Friday's Times. In the wake of news that the U.S. economy shed another 651,000 jobs in February, there's a growing realization among analysts that a lot of the jobs we've lost are gone for good:

“These jobs aren’t coming back,” said John E. Silvia, chief economist at Wachovia in Charlotte, N.C. “A lot of production either isn’t going to happen at all, or it’s going to happen somewhere other than the United States. There are going to be fewer stores, fewer factories, fewer financial services operations. Firms are making strategic decisions that they don’t want to be in their businesses.”

This dynamic has proved true in past recessions as well, with fading industries pushed to the brink during downturns before others emerged to create jobs when economic growth inevitably resumed. But with job losses so enormous over such a short period of time, some economists argue that the latest crisis challenges the traditional American response to hard times.

For decades, the government has reacted to downturns by handing out temporary unemployment insurance checks, relying upon the resumption of economic growth to restore the jobs lost. This time, the government needs to place a greater emphasis on retraining workers for other careers, these economists say.
The stimulus spending bill signed last month includes $4.5 billion for job training. That only begins to address an area long neglected, said Andrew Stettner, deputy director of the National Employment Law Project in New York. In current dollars, the nation devoted the equivalent of $20 billion a year to job training in 1979, compared with only $6 billion last year, Mr. Stettner said.

“We have to seriously look at fundamentally rebuilding the economy,” he said. “You’ve got to use this moment to retrain for jobs.”

Naturally, I agree with Andy (whom I know a bit and have done some work with). If anything, I think he understates the extent of federal disinvestment in job training over the last three decades: as I recall, as a nation we spent closer to the equivalent of $30 billion in the mid-1970s, compared to less than $4 billion a few years ago. Given that the economy in the meantime has become only more reliant upon educational attainment and skills mastery, to say we've been going in the wrong direction is an enormous understatement.

But at the risk of basically repeating my last post, the problem is that "job training" isn't really what we need right now, at least not as it's come to be understood: short-term preparation, through soft skills or specific instruction, for a relatively menial job, at which point the individual receiving help is pretty much wished the best of luck. What's required is a massive investment in post-secondary education, something more like the GI Bill than any New Deal legislation. Eliot Spitzer, of all people, articulated a good idea about how to do this in a recent Slate column. (It's really too bad that Spitzer rendered himself a joke for the rest of his life; this guy was a quality public servant, even if personality-wise he was pretty much William Rawls with a lot more power.) It's hard to imagine much public sympathy for such an undertaking right now, though--despite the fact that without it, we won't be able to fully populate Florida's "landscape that can accommodate and accelerate invention, innovation, and creation."

Whether we're talking about new spatial arrangements or new economic patterns that require a fundamental reorientation of the workforce and an unprecedentedly large bet on human capital, it seems clear that big changes will be needed; things aren't going back to the way they were. I'm starting to believe that until we as a country internalize this truth, there's no chance we'll pull out of the tailspin.

Wednesday, March 04, 2009

Needed: Tools for the Times
I'm currently putting together a short commentary piece for CUF looking at the question of how New York City should spend its approximately $70 million in workforce dollars from the federal American Recovery and Reinvestment Act (ARRA). This is a big chunk of change: the city's entire allotment under the Workforce Investment Act (WIA) for FY2008 was only $66.4 million. So effectively we're getting an extra year of funding with a use-it-or-lose-it proviso attached: my understanding is that the money has to be spent by June 30, 2010. The city is likely to get an even bigger infusion of money to support public assistance programs under Temporary Assistance for Needy Families (TANF); there's a good deal of overlap in the clientele for WIA and TANF, and they do some of the same things, but for reasons probably not worth going into it's preferable for me to focus on WIA in the piece I'm writing.

The problem in both cases, though, is that TANF (signed into law in 1996) and WIA (passed in 1998) emphasize short-term attachment to the labor force--or in plain English, getting a job ASAP. Rather than pile on another layer of bureaucracy, the Obama administration and Congress have determined that ARRA money should flow through existing channels--so the money has to follow the rules of those two programs and is bound by their limitations. And this presents a pretty serious problem: if the programs are designed to put people into jobs quickly, and contracts with service providers are predicated upon the providers' ability to make those job placements, what happens when there simply aren't jobs into which people can be placed?

What I think should happen, and what I imagine I'll write about when this thing comes out, is that we expand the definition of a successful program outcome to reflect current labor market realities. I heard from a colleague earlier this afternoon that even Ron Haskins, the conservative scholar now in residence at Brookings who helped author TANF in the mid-1990s, now believes that public assistance recipients should be eligible for more job training than the program facilitates. (A Google search failed to yield any written evidence of this, so I'm taking it on the credibility of my source.) In both TANF and WIA, that might mean more adult education and a recognition that if an individual advances from a fifth-grade reading level to a ninth-grade reading level, it might not help her get a job today but it very likely will in two years, all else being equal. Under WIA, that could entail expanding the time horizon for training, either through the individual training vouchers now available to customers of the system or by setting up something like the No Worker Left Behind program now running in Michigan, which would amount to helping facilitate community college enrollment.

As much of a political bugaboo as this might be, I'm going to invoke Europe: when the economies of the Continent tip into recession, the common practice is that people there head back to school--waiting out the lousy job market while adding skills to boost employability and earning power later on. We should embrace that approach here as well. Indeed, David Leonhardt notes in today's New York Times that we've done this before:

[T]he third factor — education — is the most important of all. It can make the pie larger and divide it more evenly.

That was the legacy of the great surge in school enrollment during the Great Depression. Teenagers who once would have dropped out to do factory work instead stayed in high school, notes Claudia Goldin, an economist who recently wrote a history of education with Mr. Katz.

In the manufacturing-heavy mid-Atlantic states, the high school graduation rate was just above 20 percent in the late 1920s. By 1940, it was almost 60 percent. These graduates then became the skilled workers and teachers who helped build the great post-World War II American economy.

Allowing programs such as WIA and TANF to support educational attainment more robustly while the economy continues to shed jobs would represent not a retreat from the "work-first" principles that informed both laws, but an investment in participants who will be able to command higher-value work when job growth resumes. (I'm setting aside for the moment liberals' ongoing doubts about whether work-first is such a good idea--another argument for another day, hopefully one when there's actually some work to make it a non-academic discussion.)

Monday, March 02, 2009

Uncharted Territory
It occurred to me this evening that if the Obama presidency fails, there's a very good chance that the next president will be neither a Democrat nor a Republican.

Now, this isn't to say that, 44 months before the next presidential election, we have the faintest idea about how Obama will fare. But I don't think that Matt Miller is wrong to assert that Obama's program and priorities, expressed most clearly (because, unlike the stimulus, it wasn't an emergency response) in last week's budget announcement, represent a profound ideological shift back toward assertive and proactive ("big") government. Nor do I believe E.J. Dionne to be incorrect when he writes that Obama's ambitious agenda carries the fate of American liberalism--which after all is getting its first real trial after 40 years of sitting on the shelf. If he succeeds--if decent growth returns by 2011-2012, a much larger percentage of the population has health coverage, and the country is on a firmer path toward environmental sustainability--then it's very likely that the Democrats, particularly their more liberal adherents, will be in the political driver's seat for a long while to come.

So we can hope, anyway. But if Obama's policies fail? Is the country really going to turn back to a Republican Party now unmistakably led if not defined by Rush Limbaugh, purveyor of Happy Meal Conservatism (and, it seems, conspicuous consumer of Happy Meals)? Unless the Republicans actually figure out how to transplant Newt Gingrich's brain into Sarah Palin's body, they'll have to take one of them, the other, the painfully unready Bobby Jindal (whose big problem is actually not his delivery--that will improve--but the fact that he'd be running for re-election in Louisiana and for the presidency at the same time in 2011), the intolerably dickish Mitt Romney ("winner" of a plurality in the straw poll at this past weekend's confab of right-wing junketeers), or some other loser who will have to kiss Rush's ring. Add that it'll be just four years after Bush--not enough time to forget that his policies dropped us into this hole, nor to miss that his would-be successor will be pushing more of the same--and I find it very hard to imagine a Republican winning.

No, we'll see a Savior Type--a businessman, most likely, who will blast Obama for his discredited faith in Big Gummit and lack of managerial acumen yet reject the Republicans for their cronyism, stale ideas, and fealty to social reaction. If not a businessman, then a general; I wouldn't be shocked to see an independent ticket emerge featuring one of each. Maybe this is why Mayor Bloomberg really wants a third term: 2008 wasn't his moment, but it's not at all implausible to think that 2012 could be.