December 18, 2006

Spirit of the Holidays

Part of my relationship with my mother requires that I hear a large amount of information that I hardly care about. I receive all sorts of news – good and bad – about people that I do not know, cannot remember, or never will meet. Maybe this is part of every son’s relationship with his mother.

So, a few months ago, when my mom began our conversation with “D, I just heard the worst story,” I immediately settled into my feign-interest mode. She then told me about an employee of a florist she works with through her event planning business, a man who had been having terrible headaches and had repeatedly been put off by doctors claiming they were too busy to see him or perform the necessary tests. She told me that had any one of us – the relatively privileged and well-connected – had a similar malady, we would have had no trouble getting an appointment and diagnosis from the doctor, but because this man was poor and black and had no health insurance, he kept running into obstacles that potentially threatened his life.

The man has since been diagnosed with a malignant brain tumor. He can no longer work. He certainly cannot afford his medical care. The family that once depended on him is now taking care of him, making sacrifices in every way imaginable.

I have rarely seen my mom so passionate and compassionate about something. She was obviously upset that this man was suffering and angry that he was not being properly taken care of. And I know what happens when my mom gets angry.

Over the next couple of months, my mom did something amazing. Heroic even.

Rather than simply allowing this man to continue unaided, she, along with the two florists the man had worked for, took it upon herself to make a difference in his life. She acted on her compassion. She decided that she would not wait for someone to help this man – why wait, she concluded, when she could help him herself?

This holiday season, my mom, along with the two florists, has led a campaign to gather money for this man and his family – to be used for Christmas presents or food or health care. Their effort raised over $1,500, an amount that was matched by Target. Thanks to these efforts – efforts that were not required and that were undertaken without knowing how they would turn out – this man and his family will have the opportunity for a relatively pleasant Christmas during this most difficult of times for them.

She reported that the man was in tears as he thanked her. “I didn’t know there were people who would just give like this,” he sobbed. There were tears in my mom’s eyes as she relayed the story to me.

What my mom did is inspiring. It means far more than any of the gifts she and my dad got for me or my wife or my daughter. It captures the spirit of charity that should be at the heart of this season, during which we have the opportunity to reach out and help those not as fortunate as ourselves. And perhaps most inspiring, it shows what happens when people act. It is so easy to simply criticize and ponder the ills of the world. It is far more difficult to act. But it is only when people act out of compassion for others or passion for making a difference that things change.

Thank you, Mom, for living this lesson this holiday season.

December 11, 2006

Leaving Brown Behind - Part II

[NOTE: This is a follow-up to last week's column]

During the Supreme Court hearing on the voluntary school integration cases heard last Monday, I was struck by a question posed by Justice Antonin Scalia. For the past week, I have grappled with the question and its implication, struggling to figure out how I might have answered it.

Frank Mellen, the attorney representing the Jefferson County (Kentucky) Public Schools, was attempting to make a distinction between the use of race confronted in Brown v. Board of Education in 1954 and the use of race by the JCPS plan aimed at maintaining a racial balance in schools of between 15 and 50 percent African American students. Brown was different, Mr. Mellen argued, because there existed two entirely distinct school systems, one white and one black. “That stigmatized the black children. It sent the message that the white race was dominant and superior and that the black race was inferior.”

At the word “stigmatized,” Justice Scalia piped up. He wondered whether the assumption underlying the JCPS plan was not itself stigmatizing. The JCPS plan, Justice Scalia said, was “based on the notion that a school that is predominantly black or overwhelmingly black cannot be as good as a school that is predominantly white or overwhelmingly white.” The potentially stigmatizing message sent by that assumption, Justice Scalia asserted, is similar to the message in Brown – that the white race is superior and the black race is inferior.

What Justice Scalia’s question exposes is that the JCPS plan to provide a quality education to all of its students is based upon the assumption that a quality education may not be available to students in a mostly-minority school. That assumption, Justice Scalia suggests and I agree, is potentially stigmatizing.

But the assumption Justice Scalia is so concerned about is not really an assumption at all, but a statistically-verified fact. Students in typical mostly-minority schools do not receive the same quality educational opportunities as students in mostly-white schools or racially-balanced schools. This is measured in terms of teacher experience, teacher qualification, access to honors courses, diversity of curriculum, and many other ways.

So, after a week of wondering what made Justice Scalia wrong, I’ve concluded that he is actually right. Yes, it is stigmatizing to assume that black schools will not be as good as white schools. But, it is also stigmatizing – and far more damaging, in my opinion – to ignore the fact that mostly-minority schools typically are not as good as white schools and then confine black students to those mostly-minority schools.

In an ideal world, plans like that in JCPS would not have to exist to ensure that as many students as possible receive a quality education. But we do not live in an ideal world. In the world we live in, Justice Scalia is likely to vote to strike down the JCPS plan. He will do so knowing that the effect will likely be an increase in the number of minority students attending mostly-minority schools. And he will do so knowing that mostly-minority schools in the United States in 2006 (the real world) do not typically provide equal educational opportunities to their students. In effect, his vote will be to send more minority students to schools providing fewer educational opportunities.

Justice Scalia would probably respond to such a charge by saying that, as a Supreme Court justice, it is not his job to consider the consequences of his decisions, but rather to interpret the Constitution. In other words, it is the world that must change, not his interpretation of the Constitution, if this unfortunate result for minority students is to be avoided.

This is precisely what makes Justice Scalia’s interpretation so dangerous. The fictional world for which he interprets law – a colorblind and ideal world – is appealing. It just is not the world in which the effect of this decision will be felt. But though Justice Scalia’s interpretation can wear the clothes of colorblindness and loyalty to the Constitution, it will achieve the exact same real world result as Jim Crow school segregation: separate and unequal schools with minority children being left behind.

December 04, 2006

Leaving Brown Behind - Part I

Today, the public schools of Jefferson County, Kentucky will take the national stage as the Supreme Court grapples with whether JCPS’s student assignment plan, a plan that takes race into consideration, passes constitutional muster. As an outsider who has studied the unsuccessful path to integration in Memphis, I hope that the Court recognizes the wisdom of taking action, as JCPS has, to achieve the goals set in motion by the Brown v. Board of Education decision more than a half century ago.

The Brown decision in 1954 famously put an end to the practice of “separate but equal” schooling. In addition to declaring state-imposed segregation unconstitutional, the Court recognized the importance of education in preparing the next generation of Americans. “Education,” Chief Justice Earl Warren wrote, “is the very foundation of good citizenship.” The Court recognized not only the necessity of outlawing legally-sanctioned segregation, but also the value of integrated schools in helping students adjust to the multiracial communities beyond the school’s walls.

Many cities, crippled by white flight from inner city school districts, have long since given up on integrated schools. Memphis is one such city. When Memphis was faced with court-ordered desegregation in the 1970s, the community essentially fractured into a black city school system, a white county school system, and a very white private school system. In the three decades since, those divisions have become seemingly permanent. Not only are the students largely separated by race, but, as is the case across the country, schools with the highest concentrations of minority students tend to perform the most poorly.

In Jefferson County, however, the community has embraced the values the Supreme Court identified in Brown. Since 1984, when JCPS began tinkering with its own court-ordered desegregation decree in order to make the schools more attractive to more families, enrollment – and, more impressively, white enrollment – has stabilized.

When the desegregation decree was lifted in 2000, rather than allowing its schools to resegregate as occurred elsewhere, JCPS enacted the student assignment plan that is the subject of the lawsuit to be heard today. The stated goals of the plan are to provide a competitive and attractive public school system, to maintain community support for JCPS, and to prepare students for life in a democratic and racially diverse society. The courts below found the plan constitutionally acceptable, holding that the JCPS policy of integrated schools is “both important and valid.”

Social science has shown just how important integrated schools can be to communities and, more importantly, to students. Integrated schools have been shown to produce increased levels of tolerance among students of all races. Surveys of Jefferson County students have shown high levels of tolerance – more than 92% of students reported that they were “comfortable” or “very comfortable” working with students from different racial and ethnic backgrounds.

In addition, African American students tend to perform better and attain better educational outcomes when they attend integrated schools. These benefits are especially pronounced in systems, such as JCPS, where integration is voluntary and begins at an early age. Indeed, the black-white achievement gap is shrinking in Jefferson County even as it persists elsewhere.

Many other cities, including my home town of Memphis, have proven unable to meet the aspirations of Brown. In contrast, Jefferson County keeps trying. With its student assignment plan and its continued commitment to integration, Jefferson County has sought to create a system that is largely integrated and equal, while other cities have regressed to a state of separate and unequal. The Supreme Court has the opportunity to embrace, as it did fifty years ago, the ideals embodied by the JCPS plan. For the sake of Jefferson County and its students, we should all hope it does.

November 20, 2006

Protecting Our Most Vulnerable

In the twelve months since last Thanksgiving, I've added one very small (but growing every day) thing to be thankful for. For almost ten months now, I've been the proud dad of a beautiful baby girl. So far, my wife and I have been extremely lucky -- our daughter is a great and easy kid, and she has been healthy and happy. It is this that I am most thankful for this year.

But even as lucky as we have been, raising a child is difficult work. For those of you who are parents, I'm not telling you anything you don't already know. For those who are not parents, the difficulty and incredible sense of responsibility of child rearing cannot be fully understood until you are staring down at a tiny, helpless human being and you know that it is now your job to turn her into an independent person. The process is filled with joy, but it requires patience, sacrifice, and careful attention to the details through which babies send us signals about their health, hunger, and happiness.

This week, my thankfulness runs even deeper as I realize I live in the city with the highest infant mortality rate among the sixty largest American cities. In Memphis, 14 of every 1,000 infants die before they reach their first birthday. That's more than double the national average. More human beings die in Memphis as infants than from homicides. In some of the poorest areas of Memphis, the rate is four times the national average. Even though these children live in the most medically-sophisticated country in the world, a child born in these areas has about the same chance of reaching its first birthday as the average child born in several third world countries.

While infant mortality is often viewed as a medical statistic, it is more properly understood as a symptom of deeper social failures. The U.S. has more neonatologists and neonatal intensive care beds per capita than Australia, Canada, and Great Britain, but has a higher infant mortality rate than these countries. The problem is not that we lack the doctors or facilities. Infant mortality risks begin well before a pregnant mother reaches a hospital. They begin the moment a woman becomes pregnant and the U.S. -- Memphis, in particular -- has done a poor job of delivering information and services to the pregnant women whose children are most at risk.

The characteristics most highly correlated with an increased risk of infant mortality are poverty, lack of education, lack of access to health care, and a mother's use of alcohol, tobacco or other drugs while pregnant. Many mothers do not see a doctor at any point during their pregnancy before they enter the hospital for labor. They neglect a visit to the doctor because they do not have health insurance or they don't know that regular visits during pregnancy are important or they cannot get time off from their employer or a thousand other reasons. Even if our health care system removes these barriers in theory, many women are still paralyzed by them in reality. This lack of early and regular prenatal care puts infants at a disadvantage even before they enter the world.

As a society, we should be alarmed that in a country where we are capable of achieving medical miracles, so many of our fellow citizens do not have access to the information or health care needed to give their children the best opportunity to survive. Even in the best of circumstances, raising a healthy child is a difficult task. To avoid making it even more difficult, we must do better at getting information to pregnant women that will allow them to have safer pregnancies and births, and we must remove barriers these women feel toward seeing physicians early and often during their pregnancies and during a child's earliest months. We will not reach every pregnant woman and we'll never eliminate the tragedy of infant mortality, but we must do better at protecting our most vulnerable.

November 13, 2006

Jon Stewart is Saving America

Last week as I watched election returns come in and prepared myself for the analysis, re-analysis, reverse analysis, and overanalysis of the evening’s results, I found myself drawn to one channel more than the others – Comedy Central?

I found that I cared more about what Jon Stewart had to say about the election than any of the talking heads on CNN or Fox News.

And I am not alone on this. In its 10 years on the air, The Daily Show has gone from an anonymous program to a bona fide cultural phenomenon. Much of that transformation is a direct result of the sharp wit and political commentary Jon Stewart brought when he took over the show in 1999. Today, 1.5 million viewers watch The Daily Show each night and even though the show touts itself as a “fake news” program, an increasing number of Americans – especially young ones – are getting real news from it.

The Daily Show’s impact can be seen in several ways. First, Stewart has made the mundane hip – a Daily Show viewer is likely to chuckle at a joke about social security privatization, a sign that the viewer is aware of and understands the unexciting issue being lampooned. Engaging young Americans in politics has always been difficult, but Stewart’s mix of substance and humor seems to be making a dent. But the real reason The Daily Show is so successful with young people is perhaps its relentless exposure of hypocrisy and doublespeak in politics. In a field where too many people take themselves too seriously, Stewart does not hesitate to point out the sheer ridiculousness of some of our leaders. Idealistic young people respond to this unmasking as they recognize the shortcomings of their government.

Not everyone thinks this is all that funny. Both the Boston Globe and Washington Post have run op-ed columns in 2006 criticizing Stewart for stoking cynicism, pointing out that increased cynicism may ultimately lead to decreased voter participation. However, these criticisms are blaming the messenger: Jon Stewart and The Daily Show are merely highlighting the flaws in our government and the way the media covers it. If cynicism is being heightened by an increased awareness of flaws in the system, it is the system that is the problem, not the reporting.

While The Daily Show has been receiving critical acclaim for years, it is now starting to receive some scholarly attention as well. During the 2004 presidential campaign, the National Annenberg Election Survey conducted a survey on campaign knowledge with questions on topics such as – get your yawns ready – social security privatization. Among young people, those who watched The Daily Show scored higher than those who watched four or more days of network or cable news.

A clue as to why The Daily Show viewers are so knowledgeable can be found in the results of another study, this one done by Julia Fox at Indiana University. Fox and two IU grad students scrutinized the coverage of the 2004 national political conventions and the first presidential debate on the national newscasts as compared to The Daily Show. The study concluded that the average amount of video and audio substance in the network news was not significantly different than the average amount of substance on The Daily Show. Doing a second-by-second analysis, the IU study found that The Daily Show had more humor than substance, but that the network coverage had more hype, such as references to polls or endorsements, than substance.

The fact that The Daily Show is becoming a legitimate alternative source for news says as much about traditional news sources as it does about The Daily Show. As the IU study concluded, neither of these sources is particularly substantive. But Stewart and The Daily Show are engaging young people in current political issues and exposing the hypocrisy responsible for so much dysfunction in Washington, while broadcast news offers no deeper substance and far less humor.

So maybe Jon Stewart isn’t quite saving America. At least he and The Daily Show are doing their part.

November 07, 2006

This Race Matters

Nearly one hundred fifty years ago, the South, including Tennessee, seceded from the United States in order to protect its right to maintain slavery.

Forty years ago, individuals seeking to register African Americans to vote in the South were abused, beaten, and killed.

The South’s racial history is well documented and not something to be proud of, to put it generously. Tomorrow, however, for the first time since Reconstruction, Southerners – in this case, Tennesseans – have the opportunity to send an African American to the United States Senate. Even if it is a long overdue milestone, the potential election of Harold Ford, Jr., would be historic.

Ford is not a typical African American politician, nor is he a typical Democrat. He has run a center-right campaign that has angered many on the left, but has brought him to election day in a winnable race. He has outworked and outperformed his opponent throughout the campaign and would be a dynamic leader for all Tennesseans.

Until recently, Harold Ford’s race would have been considered a major obstacle for his campaign to overcome. This campaign, however, has been less about race than one would have expected in the South. The one exception, of course, is the now infamous ad produced by the Republican National Committee showing a bare-shouldered white playmate mouthing “Harold, call me,” in an allusion to traditional white Southern fears of interracial intimacy. Though it would be naïve to believe the ad did not intentionally appeal to a racist sentiment, its impact has been largely overstated by a national media keen on making news, particularly news that makes the South look like a bunch of racists. Any Tennessean who would have been swayed to vote against Ford by the ad’s racial allusion would be unlikely to vote for an African American candidate in the first place.

“I’ve never thought about race,” Ford has said. “Don’t believe for one moment just because we’re in the South that we can’t look for what’s in our best interest, and look for the person who will best serve and represent us.” Southerners, Ford seems to be saying, aren’t as backward as you’d like to think.

Whether Tennesseans will elect Ford is a tossup, but he is the better candidate for all Tennesseans regardless of party affiliation. He is by far the more passionate, energetic, and talented politician of the two candidates. In the dealmaking and publicity-seeking that make a successful Senator, Ford will excel. Through Bill Frist’s ascendancy to Senate majority leader, Tennesseans have learned the local, on-the-ground benefits to be gained by having a visible and successful Senator. Where Ford’s energy and passion will make him stand out, his opponent will likely blend right in with the other older white males in Washington.

For the independent, moderate voter, Harold Ford is an ideal candidate. He is a pragmatic politician who seeks consensus and moderation rather than division and ideology. He is unafraid of reaching across party lines to find sensible solutions, unlike his opponent, who is less likely to vote independent of his party.

Strangely, the loudest criticism of Ford often comes from the left. Even if Ford’s policies are more conservative than some Democrats are comfortable with (and I include myself in this group), he could be part of a larger Democratic Senate takeover that would help push the national agenda. Some of the greatest progressive legislation of the twentieth century was passed with a coalition of northern and western liberals and southern moderates. Ford’s election could help create a modern revival of that coalition.

And for those concerned with making history and reaching long overdue milestones, Harold Ford just might do that too.

What Will Our Children Think?

In 1967, interracial marriage was prohibited in sixteen states and polls showed that as many as 72% of the American public opposed legalizing interracial marriage. During that year, the Supreme Court ignored that public sentiment in the case Loving v. Virginia, ruling that under our Constitution, “the freedom to marry, or not marry, a person of another race resides with the individual and cannot be infringed by the state.”

We are only one generation removed from that era, but to a child of the 1980s, raised after the opening of society to women and members of different races, the ban on interracial marriage seems almost medieval. I cannot conceive of people being denied marriage licenses based upon the color of their chosen mate’s skin, and I think the fact that such a ban is mind-boggling to me represents progress.

I wonder what my children will think a generation from now when I have to explain the presence on tomorrow’s ballot of a constitutional amendment banning same sex marriage.

I’m willing to concede that same sex marriage and interracial marriage do not raise precisely the same set of issues. Yet, the opposition to each is remarkably similar and I suspect that just as the opposition to interracial marriage has disappeared as society has become more accepting, so will the current hostility toward same sex relationships eventually subside.

The trial court that initially convicted the interracial couple of violating Virginia’s interracial marriage ban cloaked its decision in religious references, declaring that “but for the interference with [Gd’s] arrangement, there would be no cause for [interracial] marriages.” The sentiment was that interracial marriage was a sin that could create a “mongrel breed of citizens” and would deny Virginia the ability “to preserve the racial integrity of its citizens.” Yet, this visceral opposition has slowly faded away.

Opposition to same sex marriage is similarly based largely on religion as opponents seek to impose religious norms onto civil relationships. What gay individuals seek, however, is not permission from these religions to participate in relationships – after all, they are well aware of those religions that condemn homosexuality. Instead, gay individuals seek a recognition by the government of the same rights and privileges afforded to others living in loving, committed relationships and a recognition by society that they are just as worthy of marriage as any other person.

The ballot initiative here in Tennessee, like others throughout the country, seeks to prevent these relationships from receiving equal recognition by enshrining a gay marriage ban into the state constitution. The effort is not only duplicative – state statutes already ban gay marriage – but is also an attempt to deny one class of citizens the rights and benefits enjoyed by all other citizens based solely on sexual orientation. That, my friends, is discrimination and a tarnishing of our constitution.

However, just as interracial marriage has slowly become acceptable, so too, it seems, will same sex marriage. According to multiple polls, significant majorities of 18-29 year-olds favor some sort of recognition of equal rights for gay couples. Going even younger, three quarters of high school seniors favor legal recognition of same sex relationships.

With numbers like these, I suspect that my children – or at least my grandchildren – will look back at this era of constitutional amendments to ban same sex marriage as outdated and closed-minded, which is exactly what it is.

October 23, 2006

Spin Zone

North Korea occupies rarefied status among the world's evildoers these days. This is due to both the attention President Bush brought to North Korea in naming it among the axis of evil (a good thing) and the inattention President Bush gave to North Korea while concentrating all resources on Iraq (not a good thing). This month’s nuclear test revealed the result of the Bush Administration’s distraction: WMDs in North Korea. North Korea has gone from an isolated, hostile, dangerous country engaged in negotiations with the global community that helped prevent it from developing a nuclear weapon in 2000 to an isolated, hostile, dangerous country that has tested a nuclear weapon in 2006. The George W. Bush era, it would seem, has not been terribly unkind to North Korea.

Yet, to hear Bill O’Reilly tell it, North Korea is working feverishly to support the Democrats in the upcoming mid-term election. In his “no spin zone,” O'Reilly declared the North Korean nuclear test a calculated effort to influence the American election. Similarly, O’Reilly asserted that Iran is ramping up the violence in Iraq so Americans will turn against the Bush Administration. “Why does Iran want the Democrats to win in November?” O’Reilly asks. Iran, it seems, is pushing insurgents to increase the violence in Iraq not to crush Iraqi democracy, but to influence American democracy.

Normally, what Bill O’Reilly says is as meaningless as it sounds, but on this issue, it appears he is not alone in his thinking. Discussing the similarities between the recent spike in violence in Iraq and the Tet Offensive in Vietnam, President Bush threw out the election card: “There’s certainly a stepped-up level of violence, and we're heading into an election.” Lest anyone be confused as to what the President meant, Press Secretary Tony Snow explained that Bush “was making a point he’s made before, which is that terrorists try to exploit pictures and try to use the media as conduits for influencing public opinion in the United States.”

The O’Reilly/Bush argument goes something like this: North Korea/Iran/Iraqi insurgency are taking action aimed at helping Democrats win elections in the United States, so if you vote for the Democrats, you are actually helping the cause of North Korea/Iran/Iraqi insurgency.

But a look at American politics from the perspective of our enemies makes you wonder why these regimes would be so eager to end the Bush era. Assuming that the goal of American enemies is to weaken the United States, it seems that these regimes ought to be rooting for more of the same from the Bush Administration rather than working for a Democratic majority in Congress.

The Bush Administration has done more to weaken America’s standing in the world than any anti-American rhetoric from Tehran or nuclear test from Pyongyang. By thumbing its nose at international treaties, flouting international standards for the treatment of prisoners, and starting a war without an international mandate, the Bush Administration has put the U.S. in an increasingly isolated global position. That position is further weakened by the enormous commitment of resources to Iraq and the lack of short-term success there. Given the kind of bumbling they’ve seen from the Bush Administration, why would North Korea or Iran really want change?

Neither O’Reilly nor Bush wants you to answer, or even ask, that question. They are instead searching for any way possible to save the Republican majority by shifting the focus off the Bush administration’s record.

Given that record, Democrats do not need any help from America’s enemies to convince voters that change is necessary, no matter what Bill O’Reilly says.

October 16, 2006

Collateral Damage in Iraq

Six hundred thousand is a lot of people. There are just under 600,000 people living within the District of Columbia city limits and just over 600,000 people living in the state of Vermont. Six hundred thousand people could fill the Rose Bowl more than six times or Madison Square Garden more than twenty times.

And according to a recently-published study done by the Johns Hopkins Bloomberg School of Public Health, 600,000 also represents the approximate number of Iraqis who have died violent deaths since the American invasion in March 2003.

Six hundred thousand is a lot of people.

The official finding of the study is that over 600,000 more Iraqis have died since the invasion than “would have been expected in a non-conflict situation.” The Johns Hopkins figure, reached by studying the mortality rate of a broad cross-section of the Iraqi community rather than relying on reporting from morgues, hospitals, or governments, is significantly higher than previous estimates from the U.S. military, the U.N. and various human rights organizations. The margin of error puts the number anywhere from 426,369 to 793,663.

President Bush, who has avoided talking much about the number of Iraqis dead, has publicly acknowledged that as many as 50,000 Iraqis may have died since the American invasion. When told of the 600,000 figure, the President said of the report that “the methodology is pretty well discredited.”

The same, of course, could be said of the President’s various rationales for beginning the war in the first place. The alleged terrorism connection and the threat of WMDs have been disproved. Even democracy promotion is taking a back seat after Hamas’s electoral victory in the Palestinian territories. The Iraq war is increasingly only justifiable as a humanitarian war – an effort to rid Iraq of the Hussein dictatorship and allow Iraqis to determine their own futures. Even assuming that the 600,000 figure is too high, one fourth of that number of deaths seems terribly un-humanitarian. One hundred and fifty thousand, after all, is a lot of people.

Even though the root cause of the massive Iraqi chaos and suffering is the continued and destructive presence of elements desperate to spread violence and fear, the loss of Iraqi life is in part an American responsibility. The American failure to adequately plan for the protection of the very civilians its war was ostensibly waged to benefit reveals how low a priority Iraqi life was given in the run-up to the war. The result of that failure is tragic. Whatever the precise number, scores of thousands of human beings are no longer alive as a result of a life-or-death decision made in Washington.

So long as the American calculus discounts or ignores the collateral damage caused by American actions, we will continue to isolate ourselves and sow resentment around the world. It is difficult to believe a government that claims it is engaged in a humanitarian activity when the human toll of that activity is so high.

American credibility is just one more casualty of the Iraq war.

October 09, 2006

The New Southern Democrat

After throwing his support behind the Civil Rights Act of 1964, President Lyndon Johnson famously declared that his act had lost Democrats the South for a generation. Indeed, in Johnson’s beloved Senate, today only 4 of 22 Senators from the ex-Confederacy are Democrats. Yet, a generation has now passed and a new southern Democrat is emerging.

Representing a constituency with different values and ideals than Democrats from the Northeast or West Coast, the new southern Democrat is not a tree-hugging liberal. The new southern Democrat can be a fan of the Second Amendment. The new southern Democrat is not afraid to talk about faith and may not always endorse the full separation of church and state. The new southern Democrat is fiscally conservative, insisting on balanced budgets. The new southern Democrat understands that the majority of Americans would support bipartisan, moderate solutions to our nation’s problems, rather than extremist rhetoric from either side.

This election cycle, the new southern Democrat is embodied by Rep. Harold Ford, Jr., who is running to become the first Democratic Senator elected in Tennessee since Al Gore in 1990. With bounding energy and tireless work, Rep. Ford has turned what was once a double-digit deficit in a red state into a neck-and-neck race with several polls showing him leading Republican challenger Bob Corker. Unlike many national Democrats whose strategy has been to sit and wait for Republicans to screw up (with Republicans politely obliging), Ford has been proactive, asserting his positions in every county and striking a balance between his Democratic roots and the conservative leanings of his home state.

One would think Harold Ford’s surge would be cause for joy among Democrats seeking to reestablish themselves in the South. Instead, Ford has been consistently criticized from his left for being overly willing to take conservative positions in his effort to be elected. A group of local left-wing bloggers openly despise Ford, claiming he has abandoned his base and is no more than a Republican posing as a Democrat. What good is having a Democratic senator, these critics argue, when he acts just like a Republican?

These critics ignore the appeal and importance of the new southern Democrat. After all, it has been southern Democrats – Lyndon Johnson, Jimmy Carter, and Bill Clinton – who have led the only national Democratic victories since 1960.

Instead of welcoming a Democrat who is making connections with voters across the state and across the political spectrum, they would have a Democratic candidate who, unlike Ford, could pass their liberal-purity test. Unfortunately, that candidate could not be elected in Tennessee, providing Republicans an easy victory and a comfortable majority.

The Democratic party, including its most left-wing elements, should accept the reality of southern values and embrace pragmatic leaders, like Harold Ford, Jr., who are able to be both southerners and Democrats. The party is currently unable to compete in southern states, leaving Democrats at a major disadvantage in national elections. By enlarging the party to include the new southern Democrat rather than criticizing him for not being liberal enough, Democrats can shift not only the face of their party, but also the balance of power in the U.S. A strong coalition of conservative-leaning southern Democrats and more liberal Democrats from the Northeast and West Coast could reenergize a party still in search of direction.

The generation LBJ spoke of losing is now behind us and it is time for Democrats to get serious about competing in the South again. The future of the party – and the country – may depend on it.

October 02, 2006

In Defense of Habeas Corpus

You have probably never heard of the Uighur population in northwestern China. Uighurs (pronounced wee-gur) are Muslim Chinese more closely aligned, ethnically and geographically, with the Afghans and other Muslim communities in Central Asia than with their traditional Chinese countrymen. In the grand scheme of world politics, Uighurs are small-time players, known mostly for being persecuted and systematically transplanted from their resource-rich homes by ethnic Chinese.

In the midst of fighting the war on terror, the United States came across several Uighurs in Afghanistan. By some accounts, the Uighurs were in Afghanistan to train to fight against the Chinese. By others, the Uighurs were seeking a way out of Chinese persecution into a friendlier environment so they could send money back to their families. By no account were these Uighurs involved in Al Qaeda or any anti-American terrorist training. Still, nearly twenty Uighurs ended up in American military custody in Guantanamo Bay, having been sold for bounty by locals to American forces.

American intelligence knew early on that the Uighurs, like many of those initially imprisoned at Guantanamo, were of little or no value in the war on terror. When it came time for prisoners to have their status reviewed, a number of the Uighurs were declared No Longer Enemy Combatant (NLEC) by the military (a more accurate designation probably would have been Never Enemy Combatant in the First Place, but let’s not get hung up on semantics). The Uighurs, however, were never told of this designation of innocence and, instead of being released, remained behind bars at Guantanamo.

Beginning in the summer of 2005, a group of lawyers assumed representation of several of Guantanamo’s Uighurs without knowing any of this history. The first step was to file a petition for a writ of habeas corpus in federal court, asking the government to state the charges against the Uighurs or release them. The government delayed. And delayed. Ultimately, the lawyers discovered that their clients had already been designated NLEC and immediately opened up a relentless campaign for release. The campaign was complicated because the government, to its credit, refused to send the Uighurs back to China for fear that they would be tortured there. Meanwhile, the Uighurs remained in prison as the government refused all requests for temporary or supervised release of these innocent individuals. The search for a country to take the innocent Uighurs finally ended a few months ago and now there is a very small population of Uighurs in Albania.

The Uighurs’ story is one of the importance of habeas corpus. Without lawyers making noise regarding the unjustified imprisonment of these individuals, it is likely that the Uighurs would have remained in prison even longer than the nearly five years they already spent at Guantanamo. Without the power to demand charges be brought, the lawyers would have been handcuffed, unable to force the government to end an indefinite and unjustified detention.

Last week, Congress passed the Military Commissions Act of 2006, a law directly aimed at legalizing many of the actions undertaken by the Bush administration in handling so-called enemy combatants – actions declared unconstitutional earlier this year by the Supreme Court. The law is a mixed bag. Although it does require American personnel to treat detainees in accordance with the Geneva Conventions (read: no torture), it does away with many of the traditional procedural safeguards meant to ensure a fair trial and prevent the executive from running amok. One such safeguard that is done away with is the writ of habeas corpus.

Passage of the Military Commissions Act is undoubtedly a victory for the Bush administration – with congressional authority, the President stands on much firmer constitutional ground than he did when these practices were first reviewed by the Supreme Court. The law, in effect, puts executive action in this arena beyond the review of the courts. However, as the experience of the Uighurs at Guantanamo shows, sometimes the executive gets it wrong. Without the courts and the protection of habeas corpus, where is there to turn next time?

September 25, 2006

Is Everything Bad Really Good For You?

What if it turned out that playing video games didn’t rot your brain after all? What if all those hours rescuing princesses or dissecting NFL defenses actually made you smarter?

This is precisely the thesis put forth by Steven Johnson in Everything Bad is Good For You: How Today’s Popular Culture is Actually Making Us Smarter. Johnson has developed what he calls the Sleeper Curve: despite the purposes for which we seek out popular culture – distraction, entertainment – the very act of absorbing that culture in the form of video games, television, movies, and the internet carries hidden cognitive benefits.

Johnson attempts to distinguish content from cognition, insisting that it is not what we’re thinking as we immerse ourselves in pop culture, but how we are forced to think while doing it.

Take video games. Today’s increasingly complicated video games require traits that translate to the non-video game world – decision-making, persistence, creativity, flexibility. The hours of concentration and frustration required to master these games cultivate these cognitive skills even as gamers think they are having fun.

Johnson applies a similar argument to television, film, and the internet. He compares the plots of The Sopranos to Hill Street Blues, and Lord of the Rings to Star Wars, concluding that today’s viewers are forced to follow more characters, more settings, and more storylines – and that we are smarter for it.

I’m willing to accept Johnson’s premise that today’s pop culture is more complex and demands more attention and thought than pop culture of the past, but I’m not sold that this is a good thing.

First, looking only at the how of popular culture while ignoring the what makes for an incomplete evaluation. The most vocal critics of popular culture are not concerned that pop culture is making us dumber, but that it is making us immoral or violent. Johnson acknowledges as much: “Popular culture may not be showing us the righteous path. But it is making us smarter.” A full evaluation of whether pop culture is truly good for us would look at all its potential effects and determine if the good outweigh the bad. Johnson does no such weighing.

However, the largest gap in Johnson’s thesis is that he does not address the need for moderation. Although he acknowledges on the afterword’s penultimate page that his book should not be mistaken for an extended justification for gluing oneself to a screen, Johnson does little to drive this point home. Johnson admittedly would not endorse a regimen that included playing video games to the exclusion of all other activity – exercise, homework, social interaction, household chores – but you would not know it from his book.

In fact, the very characteristic Johnson champions in today’s pop culture – complexity – makes it less likely that we will be able to pull ourselves away. It is because so much thought is required to crack a video game that a player must spend hours upon hours playing. It is because following The Sopranos requires such a full understanding of the plot and characters that we must watch, then TiVo and rewatch each show, buy the DVD of previous seasons to catch up or refresh, and check internet chat sites to discuss the plot permutations and hidden jokes.

I’m not saying that video games, television, and the internet are bad for you. I like all three. But Johnson’s book provides an incomplete evaluation of their pros and cons before boldly declaring that they are good for us. After all, what good are sharper cognitive skills if they will only be used to better consume pop culture?

September 15, 2006

Dallaire's New Mission

When Lt. Gen. Romeo Dallaire speaks of Rwanda, his voice quickens. His tone hardens. The room he is in becomes silent as his audience can feel the emotion barely hidden beneath the general’s tough exterior. He poses unanswerable questions about the decisions his soldiers confronted in Rwanda. He spares no party – including himself – in assessing how the world failed to act to stop the Rwandan genocide. On other topics, Dallaire can be charming, even humorous, but on Rwanda, there is only passion.

Underlying Dallaire’s persistent frustration, perhaps even shame, about the inaction of the global community is a belief that, as he puts it, “no human is any more human than any other.” He believes this despite the glaring disparities in resources committed to confronting crises around the world. He believes it despite his own experience in Rwanda, where stopping the slaughter of 800,000 was deemed not worth the risk of casualties to peacekeeping nations.

Dallaire tells of a young boy he encountered on a road in Rwanda amid huts filled with decomposing bodies. Fearful of a trap, Dallaire approached cautiously. Beyond the malnourished body and filthy rags, Dallaire recognized in the boy’s eyes the same thing he had seen in his own four-year-old son’s eyes when he had departed for Rwanda. They were the eyes of a human child. In the boy’s eyes and those of his son, Dallaire recognized a common humanity that sustains his belief that no human life is worth more or less than any other.

Today, Dallaire’s beliefs are being challenged, again in Africa. Although the global community has been more active in Darfur than it was in Rwanda, the results have been modest.

At the end of this month, the African Union force that has been monitoring the situation in Darfur, Sudan, will officially run out of funds and abandon the region. Although no one believes that the African Union force is adequate to fully stop the violence in Darfur, their removal would result in even greater lawlessness and suffering. The United Nations has approved the deployment of a mission in Darfur – a mission far short of the 44,000 peacekeepers Dallaire recommends – but that mission will not deploy without the consent of the Sudanese government. The Sudanese government, of course, has been complicit in the effort to displace or eliminate the African tribes suffering the most in Darfur, and the government has steadfastly refused to accept any non-African troops.

Witnessing the lack of will by the developed world to sustain the attention and pressure necessary to take effective action in Darfur, Dallaire recognizes the same double standard he encountered in Rwanda. Where, he wonders, is the rule that says it is OK to send 63,000 troops to the former Yugoslavia to contain suffering there, but it is completely unreasonable to send 44,000 troops to Darfur? Who makes the decision, he asks bluntly, that it is not worth a single soldier’s life to save thousands of lives just because of where those who will die live or what they look like?

Perhaps Dallaire is being naïve. After all, it is self-interest that drives foreign policy, not some overriding altruistic concern for humanity. Yet, how could Dallaire be naïve after witnessing the most horrific consequences of strictly self-interested foreign policy, the most rapid genocide in human history? To Dallaire, these consequences are morally unacceptable and he has made it his mission to call the world out on its policies.

When Lt. Gen. Romeo Dallaire speaks, people listen. We listen because Dallaire refuses to remain in the comfortable world of pragmatic foreign policy, wading instead into the complex realm of morality. We listen because we all know that on a fundamental level, he is right – no human life is more valuable than any other. But mostly, we listen because although Dallaire has seen the very worst of humanity, he refuses to surrender hope of a peaceful future and offers us tools with which to get there.


Check out my article in the Memphis Commercial Appeal previewing Dallaire's visit.

September 14, 2006

Facing Horrors of Rwanda Offers Crucial History Lesson

(published in Memphis Commercial Appeal - September 14, 2006)

Twelve years after the Rwandan genocide, Romeo Dallaire is still on a mission, and tonight he will bring that mission to Memphis.

Rather than allowing himself and his traumatic experience as military head of the United Nations mission in Rwanda to fade into history, Dallaire insists on reminding us of the fastest genocide in human history, a three-month period in which 800,000 Rwandans were murdered. By refusing to let go of his horrific memories from Rwanda, Dallaire has embarked on a new mission: to force the global community to confront the reasons for and consequences of inaction in the face of unfolding genocide.

In 1993, Lt. Gen. Dallaire, a Canadian officer, was deployed as head of a multi-national United Nations force charged with enforcing a fragile peace in Rwanda, an obscure African country he could not locate on a map. In early 1994, Dallaire began to understand that rather than working to sustain that peace, some elements within Rwanda were instead plotting the "extermination" of the country's Tutsi population. Dallaire pleaded with his superiors for the authority to act early to impede this genocidal plot, only to be told that such action was beyond the scope of his mandate.

Several months later, as extremists ruthlessly executed the very plot of which Dallaire had been warned, Dallaire lacked the supplies, manpower and authority to effectively confront the perpetrators. Though the limitations were imposed upon him by others, the result, 800,000 murdered Rwandans, weighs heavily on Dallaire's conscience.

After leaving Rwanda, Dallaire attempted to return to a normal life, but how could he return to the world he knew before, knowing that it was the global community who forced him to sit with his hands behind his back as 800,000 human beings were slaughtered in front of him?

The immediate effect upon Dallaire was a severe case of post-traumatic stress disorder that ultimately led to a medical discharge from the Canadian military and even a desperate suicide attempt. Fortunately, Dallaire has emerged from this dark period with the energy to face the history of genocide in Rwanda and apply its lessons to crises of today.

When Dallaire speaks, it is not simply to recap the history of the Rwandan genocide, although he certainly has a unique perspective and an unflinching willingness to discuss the horrors he witnessed there. Instead, Dallaire tells his stories from Rwanda to expose the flaws in the global response (or lack thereof) and to urge his audience to act to address those flaws and prevent their repetition elsewhere, such as in Darfur, Sudan.

In this way, Dallaire is an embodiment of the mission of Facing History and Ourselves, an organization aimed at using events of history as a lens to examine problems confronting students and communities today. Through teacher training, student symposia and community events, such as the visit by Dallaire, Facing History encourages individuals to understand how human behavior and individual choice play a critical role in shaping history.

Facing History has even reached Rwanda itself, having been part of an effort to create a curriculum for teaching Rwandan history despite a moratorium on teaching that history imposed in the aftermath of the genocide. Facing History is now charged with training Rwandan teachers to instruct students on this most sensitive topic in a way that lays a foundation for a future generation that will not have to endure such crimes. Thus far, Facing History has trained an ethnically and geographically diverse group of more than 150 Rwandan teachers, demonstrating its understanding that while the world can learn a great deal from the genocide in Rwanda, it is Rwandans themselves who must most directly confront their own history.

In the case of Rwanda, Romeo Dallaire is at once the history we must face, having played a critical role in the Rwanda narrative, and a powerful, moral voice on how that history applies to today's world.

His continued refusal to fade away serves as a living, breathing testament to what can happen when the world sits idly by in the face of crimes against humanity. This is Dallaire's current mission and the world is fortunate that he remains strong enough to accept it.

September 08, 2006

Still Haunted

There has been much talk from liberal commentators during this anniversary week that the “new normal” that was supposed to follow the attacks of September 11 was either short-lived or altogether illusory. I could not disagree more. Five years later, I remain haunted by that morning and I do not think I am alone.

For my generation, September 11 marked the first time our country had been stung and it carried with it a new and disturbing sense of national vulnerability. For the first time, I began to question American dominance in more than just a theoretical way. No longer were we impervious and untouchable. The safety that was taken for granted as I went about my life could no longer be taken for granted. After all, the attacks were aimed at civilians, just like me, going about their daily routine. The attackers were aiming at all of us.

I remember sitting alone in my apartment that morning in utter shock, spending the entire day glued to the television. I remember waking up in the middle of the night and checking the news with hopes that the whole tragedy had been somehow imagined. My law school classes were cancelled the next day, leaving me with nothing to do but worry and wait, growing more aware by the minute that the certainty of September 10 would not be returning any time soon.

What I remember most about the immediate aftermath was the unnerving sense that anything was possible. At that point, we did not know if September 11 was only the beginning of something even more diabolical. We did not know then that the next five years would pass without another attack on American soil.

In the five years since September 11, the events of that day have been used by various individuals and groups for all sorts of purposes. Both political parties have used September 11 to suit their political needs; the government has been particularly successful in capitalizing on the patriotism that followed the attacks to push various items on its agenda; a terrorism industry has sprouted; the 9/11 commission sought to provide a definitive account of what went wrong; lately, Hollywood has weighed in with movies.

But through all this, for me at least, the core of raw emotion unleashed that morning remains.

I am reminded often of the attacks – every time I see a skyscraper or an airplane, certainly every time I’m on an airplane – and each time I am returned to the fear of that morning. I find myself shaking my head, still in shock at such a traumatic event. Making the national trauma personal, I have had several dreams relating to the attacks and I have great difficulty watching coverage of the attacks.

So while it may be true that my everyday life has returned more or less to September 10 normalcy – at least on the days when I am not at the airport, from which convenience departed long ago – it is not at all true that the “new normal” that followed September 11 is gone. I am still adjusting to living in a country that is vulnerable and in a world where people who know nothing about me or my beliefs want to kill me.

Five years later, I am still haunted by September 11, 2001.

September 01, 2006

Why Katrina Hurts

The past week has brought us countless stories on the state of New Orleans and the Gulf Coast one year after the most devastating natural disaster in American history. That superlative is not only appropriate based on the terrible loss of human life, displacement of entire communities and enormous damage to property, but also because with its blistering winds and unquenchable thirst for destruction, Hurricane Katrina bruised the American psyche in an unprecedented way.

Though certainly Katrina was a monster storm, it was not the first hurricane to bring extensive damage and death to American shores. The names Hugo and Andrew still make citizens in Charleston and Miami tremble. But Katrina and its aftermath cut deeper, affecting not only those along the Gulf Coast, but all Americans. Why does Katrina hurt so badly?

In one week, Hurricane Katrina exploded myths of American ingenuity and craftsmanship and exposed inequalities many of us willfully ignore, forcing us to confront the reality that we, as a country, are not exactly what we think ourselves to be.

Despite overwhelming evidence to the contrary, Americans imagine our country to be one of boundless opportunity where everyone has the chance to hit the lottery. The gap between wealthy and poor, however, continues to expand and threatens to create a permanently impoverished and undereducated class of citizens with a very low ceiling on what they can achieve in the United States. Most of the time, those stuck in this rut are confined to certain parts of the city and heard from only on the local news when interviewed about a neighborhood crime. After Katrina, Americans all over the country were forced to confront the poverty that our nation tolerates. For several weeks, the individuals our society does least to protect were brought out from the shadows and onto their rooftops with pleas for help.

The way Katrina's human toll cut along racial and economic lines exposed for all to see the second America that John Edwards so eloquently brought to life with his "Two Americas" speeches. During Katrina, the haves, the have-a-littles, and the have-a-whole-lots got a glimpse of the have-nots struggling to get by in this country. Katrina brought America face-to-face with its greatest vice, inequality, and many Americans were shocked and repulsed. In this land of supposedly boundless opportunity where anyone can make it through hard work, Katrina drove home the fact that some Americans have it a whole lot better than others.

In addition, the completely bungled response by all levels of government to the unfolding disaster exposed all that is wrong with a current leadership class that is focused more on elections than on governing, more on appearing to help than on actually helping. Such bogus leadership escapes unmasking until a moment of crisis comes along, at which point the empty heads and suits in leadership positions are reduced to impotent spectators. Katrina did a heckuva job of lifting the mask on all levels of government failure.

Add to the leadership vacuum Katrina exposed the government's extreme slowness and inability to protect or rescue its citizens, and you get a genuine national embarrassment. Here we were, the most powerful country on the planet, unable to reach our own citizens in a major city several days after the storm.

And then there was that group of citizens, themselves embarrassing the country by taking advantage of the anarchy of the times to rob, loot and threaten for their own pecuniary gain. They reflected a culture of selfish thuggery where crime is a badge of honor and laws and law-abiding citizens are inconveniences that would best be disposed of. They are not the majority of Americans, but they are part of the American underbelly exposed by Katrina.

Why does Katrina hurt? Because it slaps us in the face with the reality that we are not doing as well as a nation as we think we are. A year later, that reality still stings.

August 25, 2006

UNfair Criticism

In the midst of the outbreak of violence in Lebanon and northern Israel, much hope for halting the war was placed on the United Nations. Now, it appears that a UN force will be charged with keeping the peace between Hezbollah and Israel. Unfortunately, the hope for stopping violence and maintaining peace is placed in a United Nations that does not and never did exist. When people look to the UN to step in and solve conflicts, they imagine a fictional world governing body with the power, will, and clear-mindedness necessary to make an impact around the world.

However, because of the very structure and authority of the organization, the UN’s true power, will, and perhaps above all, clear-mindedness, fall far short of the hopes placed on it, leaving the UN open to consistent criticism when things go badly.

The UN as it actually exists is by definition limited, constrained, handcuffed, paralyzed. Despite lofty rhetoric, it is not a world government with world peace as its agenda, but a collection of national governments with national interest as their agendas. Without the consent of the member states, the United Nations cannot even issue staplers to its employees, much less compel the deployment of troops sufficient to quell violence in Lebanon, Darfur, or anywhere else.

Yet it is the UN that is the convenient answer to all the world’s ills and the convenient scapegoat when all the world’s ills go unsolved. Yale historian Paul Kennedy, author of “The Parliament of Man: The Past, Present and Future of the United Nations,” notes that if the UN fails to bring a lasting peace to Lebanon, the consolation for the world will be that “we will all be able to blame the United Nations for being ineffectual, weak-toothed, anti-Israel or anti-Arab, and thus of no good to the world community.”

Such proclamations have become standard accompaniment to global conflicts, as though it is the UN that is the source of the world’s discord rather than the bad-behaving individual nations – or as though it is the UN itself rather than the obstructive member states within it that prevent the UN from adequately responding. Certainly the UN has its own troubles, including corruption at high levels, but the lion’s share of the responsibility for UN failures in responding quickly and substantially to unfolding crises requiring military responses rests with member states looking out for their own interests.

As we evaluate the UN, we must remember that at its essence, the United Nations is a building. It is a forum for the nations of the world to gather and discuss the problems of the day. It is a place for diplomacy and smoky room dealmaking. In this way, it is an alternative to war and it has been somewhat successful in putting global diplomacy on at least equal footing with military confrontation as a means of solving problems.

Given its actual role and mandate, the UN is very good at limited tasks – negotiating peace agreements, coordinating and delivering humanitarian aid around the world, observing elections and assisting in rebuilding infrastructure in war-ravaged countries, keeping human rights on the global radar, providing international business and legal guidelines as national borders disappear in those areas, and gathering statistics for reports that reveal trends in the world. Conspicuously absent from this list is anything related to military endeavors. Yet it is on the military front, in Lebanon today and somewhere else tomorrow, that the world seems to expect the most from the UN only to consistently be disappointed.

Paul Kennedy argues that “the U.N.’s performance can only be measured against its existing capacities and authority, not against some mythical, nonexistent strengths.” It is unfair to condemn the UN for being unable to accomplish tasks it is fundamentally unqualified to accomplish, especially when the UN remains unqualified in part because of the member states’ unwillingness to give the organization broader power. The criticism is unfair perhaps, but politically useful and not likely to end any time soon.

August 04, 2006

New New World Order

When the Berlin Wall came down, there was much discussion about how the world would reorganize itself in the post-Cold War world. During the 1990s, the United States established itself as the world’s lone superpower, leading the globe through a decade of relative prosperity. With unprecedented superiority in economic and military prowess – the two measures central to the Cold War – American dominance on the global stage seemed to have no end in sight.

Yet with the arrival of the 21st century, it became increasingly apparent that the metrics used to measure power during the Cold War no longer captured the whole story. Sometime between 1989 and 2001, a shift in how to measure global power occurred. And with this shift came the realization that though still the world’s only superpower, the United States no longer wielded the amount of power it once had. The Bush Administration has come face to face with that realization, albeit reluctantly, as it has sought to assert America’s will on the world with varying degrees of success.

As Harvard’s Samantha Power puts it, in thinking about power today, we’d be better served thinking in terms of influence. In other words, power in the 21st century depends not only on the strength of a nation’s economy and military, but on the extent to which it can affect the way the rest of the world behaves. Part of the reason American influence is on the wane, Power argues, is that influence stems from two variables that American leaders think too little about: (1) other nations’ trust that the United States will use its power legitimately, and (2) other nations’ faith that the United States is capable of achieving what it puts its mind to. American policies have dealt severe blows to each of these variables and the consequences are being felt all over the globe.

As a result of the Cold War battle for hearts and minds being waged all over the world, it was in the United States’ interest to act – or at least appear to act – in a legitimate way with respect to other countries. With the alternative being Soviet authoritarianism, the global sense was that the United States could be counted on to do the right thing. With its rich and progressive culture, the U.S. was the global good guy. With the contrast of the Soviet superpower removed, the United States is now held to a different standard. Every decision is analyzed with increasing scrutiny all over the world, analysis that is made all the easier by the instant information allowed by the internet. Considered under this microscope, there is an increasing sense that American power is not being used legitimately. With the ascendancy of the Bush administration, skeptical of international opinion and unilateral by nature, that sense has been exacerbated.

The Bush administration has also unwittingly diminished American influence by undermining the global faith that the United States can accomplish what it sets out to achieve. Although perhaps no less effective than the administrations that preceded it, the Bush administration’s failures have been particularly visible – the inability to capture Osama bin Laden, the inability to rescue American citizens after Hurricane Katrina, and, most damaging, the inability to secure Iraq.

It is, of course, Iraq that has had the most severe impact on American influence. The buildup to the war occurred amid the perception that war was a foregone conclusion, a perception fed by the constantly shifting rationale for the war. The execution of the war, meanwhile, has exposed weaknesses in the decision-making process governing the American military. The current momentum for withdrawal confirms a lack of public will often cited by the most zealous anti-Americans.

When President Bush says that leaving Iraq would send the wrong message to the world, he is correct. However, the wrong message has already been sent. As a result of American policies and the limitations those policies have exposed, we are on the brink of what was not conceivable as recently as a decade ago – a world where American dominance is no longer assumed and where stateless leaders of insurgent groups who capture the hearts and minds of their constituencies can stop the world’s superpower from having its way.

July 28, 2006

Race and Politics in Dixie

A few blocks from the heart of downtown Memphis sits the National Civil Rights Museum, one of the city's treasures. Situated at the Lorraine Motel, site of Dr. Martin Luther King, Jr.'s assassination, the Museum offers visitors the opportunity to relive civil rights history, including the chance to watch Dr. King's historic "I Have a Dream" speech in its entirety.

Exiting the Museum this election season, Memphians are being reminded both of the progress that has been made and of how far there is still to go to fulfill Dr. King's dream, as two candidates from Memphis – one black and one white – seek to prove that race is no longer a bar to winning elections, even in the South.

Tennessee's ninth congressional district, the majority of which lies in Memphis, has been represented in Congress by an African American since 1975. This fall, the incumbent, Rep. Harold Ford, Jr., is pursuing a seat in the U.S. Senate, leaving a slew of candidates – 20 in all – vying to replace him. Among them is state senator Steve Cohen, who stands out from the pack for a simple reason: of the presumed frontrunners in next week's Democratic primary (from which the likely general election winner will come), Cohen is the only one who is white.

During the campaign, few rivals have questioned Cohen's qualifications or commitment to the constituents of the district. Instead, as recounted in this newspaper, several have explicitly or implicitly suggested that Cohen is unfit for the seat simply on the basis of his race. Often, such quips are prefaced with qualified praise like "Steve's a good guy, but…" The unspoken yet well understood "but" is that the ninth district, which is 60% African American, should not send a white representative to Congress, no matter how qualified. One candidate even sent an email to supporters laying out what he sees as the dire stakes, threatening that "For the first time in 30 years Memphis could be without African American representation."

Such efforts to make race a qualification (or disqualification) for office appeal to the basest instinct of American politics – the instinct to make important decisions based solely on race. Playing the race card in this way, these candidates seek to simplify an important and complex congressional race into, literally, a black and white choice.

Meanwhile, Rep. Ford, the man vacating the ninth district seat, is attempting to break a color line of his own as he seeks to become Tennessee's first African American Senator. In a state that is 80% white, there has been remarkably little talk about Ford's skin color as a potential disadvantage for him. There have certainly been no ominous "Tennessee might be without a white Senator" emails from Ford's opponents. Such tactics would be roundly – and rightfully – denounced, with the loudest denunciations coming from some of the same people pleading against electing Mr. Cohen on the basis of his skin color. And while it would be naïve to believe that the color of Ford's skin does not affect the way some individual voters think of him, it has been refreshing to see a campaign by an African American for statewide office in the ex-Confederacy that is not focused on race. Regardless of whether Ford wins in November, that is progress.

The candidacies of Rep. Ford and Mr. Cohen offer 21st-century illustrations of the centuries-old intersection of race and politics in America, an intersection whose continued relevance was affirmed by this month's reauthorization of the Voting Rights Act. The opposition to Cohen based on skin color rests on the assumption, despite Cohen's two-decade-long record to the contrary, that a white person cannot effectively represent African American interests. And it comes even as Ford simultaneously seems to be disproving the converse assumption: that an African American cannot represent a majority-white state.

Both Ford and Cohen – as well as several of Cohen's opponents – are distinguished and qualified candidates. Both have sought to rise above divisive racial politics by asking voters to judge them by their record rather than their race, or, stated more eloquently, by the content of their character rather than the color of their skin. This is what they and the citizens they seek to represent deserve and exactly what Dr. King dreamt of more than forty years ago.

July 21, 2006

A Genocide By Any Other Name

During the 1994 genocide in Rwanda, officials in the Clinton administration went to great lengths to avoid calling the unfolding tragedy “genocide.” Rather, they chose the term “acts of genocide” apparently in order to avoid any legal obligation under the 1948 Genocide Convention to take action to stop any activity deemed “genocide.” When asked how many acts of genocide it takes to make genocide, a State Department spokeswoman answered meekly, “I’m just not in a position to answer that question.” The semantic effort was largely successful as the Clinton administration did virtually nothing to stop the murder of 800,000 Rwandans.

Human rights scholars took from this experience the lesson that language mattered – because the Clinton administration was so intent on not saying “genocide,” the scholars concluded that had the word been uttered, action would have followed.

The wisdom of that lesson has been put to the test as another tragedy unfolds in Africa, this time in Sudan. Applying the Rwanda lesson, activists pushed strenuously for the Bush administration to classify as genocide the killing and looting of African tribes in Darfur by government-backed militias. On July 22, 2004, the U.S. Congress declared that genocide was occurring in Darfur. Two months later, the Bush administration agreed, as Secretary of State Colin Powell declared “genocide has been committed … and genocide may still be occurring.” The human rights community celebrated these declarations with the hope that significant action would follow.

Two years have now passed since the congressional declaration and although the Bush administration has taken action, far more action than did Clinton in Rwanda, dreadful and dangerous conditions persist in Darfur.

The top United Nations envoy to Darfur, Jan Pronk, recently observed that, two months after a May 5 peace agreement among many of the parties involved, the situation is as bad as it had been two months before the agreement. The UN has had to halt humanitarian assistance in some parts of Darfur because aid workers have been killed, and the violence is spilling into neighboring Chad. The implementation of the peace agreement has been generally nonexistent, and the African Union force deployed in the region is set to run out of funding this fall, leaving a several-month gap before UN forces take over no earlier than January 2007.

In short, the experience of Darfur has proven the limitations of the lesson of Rwanda that language matters. (Apparently, the UN did not get this memo, as it has resisted declaring Darfur a “genocide,” instead asserting in Clinton-esque fashion that “in some instances individuals may commit acts with genocidal intent.” This statement, of course, begs the question – how many acts with genocidal intent make genocide?) Further, the failure of states to take action to stop what has been labeled “genocide” reinforces the fundamental weakness of all voluntary international agreements, such as the Genocide Convention – enforcement. Unless there are consequences for failing to abide by a legal obligation to act under the Genocide Convention, tempered action like that taken in Darfur is the likely outcome.

Not that the Bush administration’s declaration of “genocide” didn’t matter. To be sure, it represented a turning point in American engagement on the issue and put the U.S. at the forefront of the effort to rein in the killing. However, it was far from the automatic trigger for decisive action that the Clinton administration had feared the word would be.

It seems that each time a new genocide unfolds, the lessons of genocides past are rendered obsolete. The world apparently has no shortage of ways to avoid effective intervention. What then are the lessons of Darfur?

The most important lesson is that publicity and the active mobilization of a constituency against genocide are possible and can move lawmakers to act. Heroic work by human rights activists turned Darfur into a somewhat mainstream topic and paved the way for the action that has been taken. Second, Darfur has shown the enormous variety of ways in which non-government actors can act. The lesson that governments cannot be relied upon to act in genocidal situations has been internalized, as a large group of aid organizations and volunteers have pushed the Darfur agenda further than any government would be willing to. Finally, the world has learned that semantics that arguably create legal obligations do not stop genocides. It is action that stops “genocide,” “acts of genocide,” “acts with genocidal intent,” and all things in between.

July 14, 2006

Lessons Unlearned

As the Supreme Court concluded its term, it issued a sweeping rebuke of the Bush administration's approach to the treatment of detainees at Guantanamo Bay. The decision, Hamdan v. Rumsfeld, invalidated the military tribunals the Administration had employed to try Guantanamo detainees -- tribunals that had severely limited the legal rights of detainees, including the right to be charged in a timely manner, the right to a lawyer, and the right to see the evidence against them prior to trial.

In response to the decision, the Bush Administration seemed to reverse several years of policy and accept that the protections of the Geneva Conventions must apply to detainees held in the war on terror. This apparent reversal was celebrated as a long-awaited awakening to reality for the Bush administration, which has sought extensive executive authority in prosecuting the war on terror, including the use of controversial interrogation techniques, wiretapping and surveillance plans, and detainee procedures.

The long-awaited awakening, however, was short-lived.

This week, the White House put increasing pressure on members of Congress to recreate the lawlessness at Guantanamo by passing legislation that would limit the rights granted to detainees. Administration lawyers told Congress that the most desirable solution would be for Congress to pass a law approving the very tribunals that the Supreme Court had said the President could not establish on his own.

Now that's the Administration we know and love.

Meanwhile, continuing its attempts to fill the courts with those who agree with its broad interpretation of executive power, the Administration is still backing the stalled nomination of William Haynes to fill a vacancy on the Fourth Circuit Court of Appeals. As general counsel for the Defense Department, Mr. Haynes oversaw a policy memo that secretly authorized harsh treatment, even torture, for detainees at Guantanamo Bay. The Administration has since disavowed the memo and Mr. Haynes himself says he's "glad it's no longer on the books," but his participation in drafting and implementing such a controversial policy was enough for 20 retired military officers to send a letter to the Judiciary Committee expressing deep concerns over his nomination.

The letter speaks for itself: "What compels us to take this unusual step is our profound concern about the role Mr. Haynes played in establishing, over the objections of uniformed military lawyers, detention and interrogation policies in Iraq, Afghanistan, and Guantanamo, which led not only to the abuse of detainees in U.S. custody but to a dangerous abrogation of the military's long-standing commitment to the rule of law."

Rather than recognizing the danger of the policies Mr. Haynes helped establish – policies the Administration has since disavowed – the Administration continues to press for Mr. Haynes to be given a lifetime judicial appointment on a very important court where he can presumably sign off on all executive attempts to operate without regard to the law.

This is, of course, perfectly consistent with the Administration's stubborn refusal to admit mistakes or compromise on matters of executive authority and detainee treatment. Only with incredible reluctance has the Administration taken even the smallest steps back from its assertion that in the war on terror, the President can do as he pleases without concern for the rule of law.

However, as Justice Stevens wrote in the recent Guantanamo case, "The executive is bound to comply with the rule of law that prevails in this jurisdiction." We're still waiting on the current executive's acceptance of this reality.

July 07, 2006

World Cup Fever

For the past month, my wife, my daughter, and my VCR (remember those?) have endured my quadrennial sports infatuation, the World Cup. I have put housework and work-work to the side in order to watch as many games as possible. I have taken early lunches and late lunches that would coincide with game-watching. I have checked the Italian and English papers online to see what real soccer journalists have to say about the games. I have read two soccer books and multiple soccer-related magazine articles, enjoying the growing selection of soccer sociology titles out there. And I have been rewarded with some fantastic games, some captivating finishes, and a final involving my two favorite non-US teams, Italy and France. (Here would be a good place to write something about the disappointing American team, but I will resist the temptation to comment on a team that could not find its pulse in Germany despite probably being the best team the US has ever had – OK, I couldn’t resist. On to 2010, I suppose.)

There is much cliché that accompanies the World Cup. Every four years, we are told how the World Cup is the most popular sporting event in the world or how the games are so important in every other country that employers give employees days off to watch their teams play. This year, we learned that the civil war in the Ivory Coast ceased while the Ivorians were in the tournament – sadly, they went out after the first round. And anyone who has paid attention to the World Cup has heard ad nauseam about the importance of the Cup as a national source of pride for the host country, Germany, still not quite unified nearly two decades after the fall of the Berlin Wall. Even the American apathy toward the Cup, and to soccer in general, is cliché – it is to the point that there are more stories about Americans not watching the World Cup than about the World Cup itself, perhaps the most attention ever given to not paying attention.

But since there is so much cliché already out there, it will not do too much harm for me to add my own: watching soccer, especially during the World Cup, is good for the soul.

The character traits required to live through the World Cup are good preparation for life. There is patience and focus – goals and game-changing plays don’t happen every minute, so the soccer watcher must be willing to focus for 45 minutes at a time and wait for the moments, if any, that justify his time. There is persistence, as there are no commercials to give the soccer watcher a break from his engagement. Even a trip to the bathroom risks missing a big moment and rendering all the previous, uneventful moments watched in vain.

The American soccer watcher must further possess a willingness and strength to stand alone, the confidence to do the unpopular thing. Perhaps even more important, the American soccer watcher must be creative and adaptable, figuring out different ways to keep up with a sport ESPN hardly cares about.

Writing from Paris after the 1998 World Cup was won by the French, Adam Gopnik wrote: “Soccer was not meant to be enjoyed. It was meant to be experienced. The World Cup is a festival of fate: man accepting his hard circumstances, the near certainty of failure.”

I could not agree more with the first point – watching soccer is an experience. For the past month, my emotions have been tied to various ninety-minute matches happening halfway around the world, usually between countries I am not a citizen of. But I think Gopnik sells soccer short by describing it as “man accepting his hard circumstances.” Neither on the pitch nor in the stands nor in front of the television is anything accepted. For ninety minutes at a time, players struggle with all their skill to avoid the fate of failure or the dreaded zero on the scoreboard. For ninety minutes at a time, soccer watchers must put their soul into the match, taking a leap of faith that it will be worth it in the end. (This time around, no game was more worth it than the Germany-Italy semi-final, decided in the final minutes of overtime after nearly two hours of scorelessness – what an unbelievable rush!)

And more often than not, it is worth it in the end. The World Cup is, as Gopnik wrote, a festival of fate. However, it is not about accepting the near certainty of failure, but about engaging with the world and enjoying the ride in spite of the near certainty of failure. And that is precisely the type of thing that is good for the soul.

June 30, 2006

Sit Down if You're Blocking the Vote

In 1965, Lyndon Johnson formed an alliance with Dr. Martin Luther King, Jr., to craft and pass the Voting Rights Act. Developed in response to endless hurdles being thrown up to keep African Americans from registering to vote or voting, the Act established a nationwide prohibition against discrimination in voting. The necessity of the law despite the Fifteenth Amendment's guarantee of the right to vote to all citizens, adopted nearly a century before, is a testament to the long and largely successful history of voting discrimination in the United States.

The Voting Rights Act was one of the most important pieces of legislation enacted during Johnson's presidency, but it was also political suicide. Knowing that support for the Act would hand the South to the Republican party, Johnson forged ahead anyway. Three years later, he stunningly withdrew from the presidential race and retired.

Voting remains a critical issue in America today. The last two presidential elections confirmed the importance of every single vote in determining the leadership and direction of this country. Yet, despite this importance (or perhaps because of it), lawmakers continue to take steps aimed at restricting the ability of Americans to vote, even threatening the future of the Voting Rights Act itself.

Although many portions of the Act are permanent law, other sections must be renewed from time to time. Several important sections are set to expire in 2007, including the requirement that states get clearance from the Justice Department before making changes in their voting procedures, such as redistricting, that could affect minority voters. Affirming the importance of these procedures to protecting voting rights, a bipartisan consensus in Congress had agreed to renew the provisions this summer. It was the type of broad support such an important law deserved. However, the expected vote was cancelled last week after several southern Republicans complained that the law unfairly targeted the South. Now, it is unknown when Congress will take up the issue or whether the Act will be renewed at all over these lawmakers' objections.

The recent efforts by lawmakers to delay renewal of the Voting Rights Act are the kind of partisan politics Lyndon Johnson, a Southerner himself, rose above in initially passing the bill. Unfortunately, efforts to delay renewal are part of a nationwide trend toward making voting more difficult for more people. According to the Brennan Center for Justice at NYU School of Law, several states – including, of course, Florida and Ohio – have recently passed restrictions on voter registration drives despite the fact that such drives have no correlation with voter fraud. These drives simply increase the numbers of registered voters and, consequently, the number of Americans who vote. Such efforts should be encouraged rather than restricted.

Meanwhile, the Supreme Court produced a complicated decision on Wednesday regarding the recent redistricting effort in Texas. On one hand, the Court reaffirmed the continued need for provisions in the Voting Rights Act like those the rebelling lawmakers complain of, ruling that a redrawn district in southwest Texas unfairly diluted the votes of Latino voters. The Court ordered a new district drawn that would be more consistent with the tenets of the Voting Rights Act. However, the Court took no action to undo the highly partisan redistricting that resulted in a net gain of four Republican seats in Texas's congressional delegation, concluding that such gerrymandering of districts by party was constitutional.

The case in Texas is a mixed result that vividly illustrates both the continued necessity for protections like those found in the Voting Rights Act and the kinds of ways lawmakers continue to manipulate the election process.

In 2006, we look back at efforts such as poll taxes and literacy tests as blatant discrimination that ought to be condemned. Yet four decades after passage of the Voting Rights Act, lawmakers continue trying to manipulate the election process wherever they can find an opportunity, whether by campaign finance regulations, gerrymandered districts, restriction of voter registration drives, efforts to prevent ballots being available in foreign languages, expensive voter identification requirements, or even failing to renew some of the most important provisions of the legislation most useful in protecting Americans' right to vote. Such efforts taint our government, our elections, and our society and are no more acceptable today than they were in 1965.

June 23, 2006

Iraq and a Hard Place

Improbably, Iraq is once again emerging as a winner issue for Republicans for the fall elections.

Despite the fact that the majority of the American public believes it was a mistake to enter Iraq and despite the bungling by the Bush administration of nearly every aspect of the war, from the prewar intelligence on WMDs to the dismissal of estimates of required troop levels much higher than those actually deployed to the inadequate equipment for our soldiers to the infamously premature “Mission Accomplished” announcement to the similarly premature declaration that the insurgency was in its “last throes” and on and on and on…. Despite all of this evidence that Republicans have failed miserably in both rushing into a war and in executing it once there, Democrats continue to flounder and flail when attempting to present an alternative course for Iraq.

Some, like Senator John Kerry, are pressing for definite timetables to bring the troops home. While the wisdom of such a policy may be debatable, it is at least a coherent plan: The war is wrong. Bring the troops home.

Others are searching for a compromise position, one that acknowledges both the perils of setting a fixed timetable and the need to begin transitioning American soldiers out of Iraq to grant the Iraqi government greater autonomy and responsibility. This policy better reflects the difficulties on the ground, but it is logically inconsistent. It requires a kind of doublethink – the war is wrong; don’t end the war – that does not easily win many supporters.

Ever since many Senate Democrats supported the authorization to use force in Iraq, the party has been stuck trying to criticize a war it is at least partly responsible for getting us into. The results, unsurprisingly, have been less than stellar.

Meanwhile, Republicans happily step in and fill the vacuum with denunciations of plans like Mr. Kerry’s as “defeatism,” “surrender,” or “retreat.” Senator John McCain presents the options as a simple choice: “Withdraw and fail, or commit and succeed.” Of course, this “choice” ignores that we have been committed for over three years and success remains elusive (unless of course one judges success by Republican electoral victories). No Republican policy more definite than “stay the course” has been proposed to give hope that continued commitment will ultimately bring more success than our previous commitment has.

Still, it is the Democrats who are on the defensive regarding the war, reacting to Swift Boat-ish questions of their patriotism and dismissive characterizations of their policies as “cutting and running.”

What is needed for Iraq is not partisan jockeying where “leaders” search for the policy that will get members of their party elected rather than the right policy. What is needed is a coherent, agreeable vision of what we would like to leave Iraq looking like and a realistic assessment of the national sacrifices in manpower and resources required to get there. If we are not willing to make those necessary sacrifices, the troops must be brought home quickly.

To this point, the Democrats appear to be reacting to the national mood – voting to authorize the war in 2002, criticizing those, like Howard Dean and Russ Feingold, who called for troop withdrawals long ago, but now pushing for an end to the operation – rather than promoting an independent policy on Iraq. So long as they are reacting rather than leading, the Republicans will be able to successfully portray Democrats as wavering and without a workable plan.

The Democrats hoped to stick the Republicans with the Iraq albatross come November, but despite repeated and continued Republican failures on the war, for now at least, it is the Democrats who remain stuck.

June 09, 2006

Supreme Court Shift?

In response to patterns leading to the rapid resegregation of public schools in cities across the country, many local school boards have enacted voluntary plans meant to help maintain diversity in their schools. Somewhat ironically, these plans have spawned lawsuits nearly identical to the cases brought in the 1950s and 60s by African American students seeking admission to schools racially segregated by law. Both sets of cases claim that making school assignments based on race violates the equal protection clause in the Constitution. The difference is that now the plaintiffs are white.

This week, the Supreme Court agreed to hear cases brought by white students in Louisville and Seattle who claim that policies in those cities unlawfully denied them admission to certain schools based on their race. Both plans were judged constitutionally permissible by the court below, but now the Supreme Court will get the final say.

Neither Seattle's nor Louisville's plan uses race as the only factor in making school assignments, but rather as one among many. In Louisville, parents are able to freely choose their children's schools so long as each school in the district maintains a minority enrollment between 15 and 50 percent. In Seattle, the plan requires school authorities to take a student's race into account as a "tiebreaker" only if that student is applying to a high school that already deviates by more than 15 percent from Seattle's systemwide racial balance. The most important thing to know is that both plans are voluntary -- not imposed by a court order -- and enacted for non-discriminatory purposes: these districts have judged that it is in the best interest of all students to have schools that more accurately reflect the demographics of the broader communities the students will be a part of after graduation.

The Supreme Court's agreement to hear the cases was quite a surprise. In 2003, the Court ruled on similar issues regarding the use of race in admissions policies at the University of Michigan law school. In that 5-4 decision, written by now-retired Justice Sandra Day O'Connor, the Court upheld the law school's use of race as one factor in admitting students, though it warned that such affirmative action would not remain permissible indefinitely. In the grade school context, as recently as December the Court refused to hear a case from Lynn, Massachusetts, identical to those of Louisville and Seattle. Why the shift?

The answer, of course, is that while the issues have not changed, the Court itself has. These cases mark the first opportunity for the court of Chief Justice John Roberts to define itself on such a big social issue. Whether the loss of the difference-splitting Justice O'Connor and her replacement with conservative jurist Justice Samuel Alito will result in a reversal on the affirmative action front remains to be seen. Justice O'Connor was still on the bench when the Court declined to hear the Lynn case.

Regardless of the outcome, however, the ideological lines in these cases reflect the enormous shift in the role of the equal protection clause in civil rights jurisprudence since the days when the first desegregation lawsuits were filed. Specifically, the idea of local control of schools has switched sides. During desegregation, conservative scholars, lawyers, and judges derided efforts by courts to tell local school authorities how to run their schools. Local control, it was argued, was a necessary characteristic of effective school administration. Now, conservatives find themselves looking to the courts for help, seeking court-ordered school admission for white students rather than allowing plans enacted by local authorities to stand. Meanwhile, liberals who once pushed for a broad and strong equal protection clause that could override local school decisions that violated it are stuck arguing for its limitation while insisting, as the conservatives once did, on leaving local school decisions to local authorities. How times have changed.

The outcome of these cases, expected to be heard late in the fall, will determine whether public schools across the country will be able to take measures meant to ensure diversity in their classrooms or whether the return to segregated schools, albeit de facto rather than de jure, will be accelerated. It would be unfortunate if after a half-century attempting to integrate public schools, we ended up right back where we started.