Politics

A Neighborhood Divided

The documentary Battle for Brooklyn, co-directed by Suki Hawley and Michael Galinsky, follows community activist Daniel Goldstein as he fights to preserve his community in the face of the massive Atlantic Yards development that threatens to carve up Prospect Heights. The proposed project would displace many lifelong residents as well as newcomers to the charming Brooklyn neighborhood. Not all of Goldstein’s neighbors agree, however; lured by the promise of affordable housing and jobs, many have sided with the developers.
Sarah Hanssen: As a native New Yorker, one of the things that really struck me as I watched the film was how divided the community seemed. Why do you think this is?
Michael Galinsky: The community was divided because the developer set out to divide it, and did so very successfully. A people divided will always be defeated, and the developer did things like help create local groups to support the project, promising them jobs. The people who supported the project got jobs, but very few others did. The media was also deeply involved in this campaign, playing into the PR playbook and printing what they were asked to print. There was no real reporting on the project, so it was nearly impossible for people to get a sense of what was going on.

SH: Why did you devote so much of your own time to this project? How invested were you personally in the outcome of the Atlantic Yards development?

MG: As filmmakers we [Hawley and I] follow stories because that’s what we do. As a neighborhood resident, I personally thought the project was a bad idea, but we tried not to be involved in the fight so that we could make a film that wasn’t a partisan attack. That would have turned off the very people we wanted to reach with it.

SH: How did your own opinions impact the way in which you made this film?

Daniel Goldstein in Battle for Brooklyn. Photo by Tracy Collins.

MG: We tried really hard to make the film as even-handed as possible. At the same time, our main character was the leader of the fight against the project. As such, if the viewer gets to the end of the film and isn’t against the project, then we have failed as storytellers, because then they aren’t with the main character.

SH: Has following this process changed your opinion of our legal system?

MG: We were already pretty suspicious of politics and such, but this was a painful wake-up call to how deeply corrupt the system is. The fact that the railyards could be handed over to the favored developer for a vastly lower sum was pretty mind-boggling. It was a lesson in the harsh realities of power.

SH: This publication strives to support artists who are creating “the world that ought to be.” Even in some small way, how would you hope this film changes the world? How has making it changed you?

MG: Making the film changed us a great deal as members of our community. We came to understand the ins and outs of local politics, and we came to understand the divisive power of power in regard to race and class. The film has done a lot to galvanize communities all over the country who are facing similar issues, and it did a great deal to wake us up to the playbook that is used to divide communities whenever those in power want to push something through.

SH: Finally, with so much injustice and so many neglected stories out there, how might you encourage either burgeoning activists or emerging filmmakers who might be starting out on a similar endeavor?

MG: Find a good story, stick to it, tell it, and don’t give up when you can’t find support. Questioning power is never a good way to get people in positions of power to help you out, so don’t expect help; just make work.

For screenings and more information, visit battleforbrooklyn.com.

Of My America

As a six-year-old you generally swallow whatever grown-ups are serving. Cynicism and skepticism have yet to develop to protect your child-brain, so you take in what you’re told, then reach for a toy.

I have fond memories of elementary school. In spite of the thin, high-collared women who towered disapprovingly over our desks, demanding conformity and encouraging uniformity, kindergarten through fifth grade were good years. We had frequent recesses, big playgrounds, and a comfortable routine. We said the Pledge of Allegiance to the flag in the classroom every morning. I was proud to be an American — mostly because I was told that my country was the best. That’s what kids dig: being the best. My earliest memory of being genuinely proud of my country is of sitting on the carpeted floor in the living room watching a black-and-white transmission from space as American men walked on the moon. Yep, the best.

Photo by flickr user aa7ae.

Now I’m 45 and I don’t give my allegiance so easily. Since the 1971 moon walk I’ve seen some things and learned some. I have another memory, on the same spot on that same carpeted floor watching that same TV. This time my mom was standing behind me, watching too. It was 1974 and a man named Nixon was quitting his job. I didn’t get it, but my mom seemed disturbed. Today, hardly a week goes by without some Congressman using the term patriot, traitor, or treason. Small, cheap words.

I can still say the Pledge of Allegiance. The words start a little staccato, but come back quickly. Warmth fills me when I say it; I like saying it. At first I resist, thinking that this warmth is deceptive, a sign of childhood indoctrination. But then my adult, rational brain reminds me of all that I’ve learned about my America since I was six. It stands for something. The stones that form the foundation of my America are still there, though they’re hard to see through the fog created by modern media and myopic leaders. History cannot be altered, only twisted.

Powerful words still echo from ancient Philadelphia — the words of Jefferson, Adams, Franklin, and their kin: Liberty, Equality, Justice, Self-determination, Self-government, Toleration, and Freedom of Religion. Even a cursory reading of the Declaration of Independence or the Bill of Rights will show that these are the ideas laid down to oppose tyranny and establish the basis of the American Way.

It was this stake driven into the sand that became a torch, a beacon that drew millions as wave after wave of immigrants crossed the oceans hoping to raise their children in this land of opportunity.

As a six-year-old I did crafts depicting the American Melting Pot, under the instruction that America was better for being made of many different peoples becoming one — E Pluribus Unum and all that. I remember colored paper cut-outs of the Statue of Liberty, and hearing the teacher recite some words carved on a bronze plaque in the Statue’s base:

“‘Keep, ancient lands, your storied pomp!’ cries she
With silent lips. ‘Give me your tired, your poor,
Your huddled masses yearning to breathe free,
The wretched refuse of your teeming shore.
Send these, the homeless, tempest-tost to me,
I lift my lamp beside the golden door!’”

Now, at 45, I understand that these immigrants, my genetic ancestors, have made this country strong and fascinating. It was they who fueled the engines of industrialization and westward expansion, without which the Civil War and World Wars might have ended differently. They lived lives of determination, ingenuity, perseverance, and bravery as they pushed the frontier all the way back to the Pacific and into the ocean. They built railroads, ships, and factories. They turned wild prairie land into farms that feed the world. They were explorers, conquerors, and settlers. But they began as homeless and tempest-tost.

Today we have razor-wire to keep out the wretched refuse.

It’s time for Thanksgiving. As a child I learned that our ancestors were Pilgrims, fleeing hate and intolerance literally to the end of the earth. Almost 400 years ago, in 1620, a few harried souls rode the Mayflower across the Atlantic to find a place where they would be free to worship. They struggled and nearly starved. But the Native Americans welcomed them and helped them survive. Those natives are also our ancestors, though we weren’t taught to consider them so. The fruit of the land was abundant. There was hardship, but harmony. A spiritual gulf separated the Pilgrims from the natives, but there was tolerance. And they were thankful.

One hundred and fifty years later, tolerance of diverse beliefs was the very first thing the Founding Fathers put into the Bill of Rights. The First Amendment begins, “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof . . . ”

Today we profile the pilgrims.

Can we be patriots, having hidden the torch of liberty and covered the founding stones with baked bricks of national security, capitalism, and so many other -isms? Can a patriot be divorced from his heritage? Can we forget and still be us?

Kids still say the Pledge of Allegiance through fourth grade here. I don’t mind. I don’t mind because they will go on to learn of Adams, Jefferson, and Franklin. Of Liberty, Equality, Justice, Self-determination, Self-government, Toleration, and Freedom of Religion. They will read in our Declaration of Independence that unalienable rights extend to all people, not just citizens. They will understand the value of immigration and how the immigrants are us. They will be proud of the rugged, fiercely independent spirits that forsook all to make their way here and build something new, something that still stands as a beacon of hope to huddled masses. They will see the stones. Then, my America will become their America, and they will pledge their allegiance.

Taking Liberties

There are certain values and practices that Americans hold dear above all others. Somewhere near the top of that list is the boastful enjoyment of free speech and expression — one of the few such values supposedly protected by the Constitution’s First Amendment, which reads, “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.”

Sadly, having been whittled down to shavings by overly critical legal interpretations and applications over the years, the ideals of the First Amendment have finally begun to collapse in upon themselves. Most recently, the U.S. Court of Appeals for the Second Circuit ruled that the use of public schools in New York City for conducting religious worship services could be perceived to violate the Establishment Clause of the First Amendment, and that the Board of Education is therefore within its constitutional right to prohibit the use of public schools for worship services outside of school hours. (Bronx Household of Faith, Robert Hall and Jack Roberts v. Board of Education of the City of New York and Community School District No. 10)

The first and most obvious problem here is that the language itself has been stretched to mean something beyond what the words reasonably imply. The Amendment clearly states that “Congress shall make no law respecting an establishment of religion . . . ” (emphasis added). It does not say that Congress shall not allow something that expresses religion; indeed, such allowance would only validate the rest of the Amendment, which says that Congress has no right to prohibit such expression or private establishment in the first place. If the use of a school might be perceived as a government endorsement of a particular religion, then the prohibition of that space for religious purposes could even more plausibly be seen as a violation of the clause protecting the free exercise of religion, given that the space in question may be used for other non-academic, non-government-related activities outside of school hours.

What is interesting here, though, is that the court’s decision explicitly states that it is not restricting the free expression of religion or even religious activities, only that it is reasonable to restrict “religious worship services” from being conducted on school grounds. This splits hairs between the idea of viewpoint discrimination, which would be a constitutional violation, and content-based restriction, which is considered viewpoint-neutral. In summary, the court says that it is okay for people to assemble in a school building, sing hymns, hear Scripture read and taught, and even to pray, but these activities cannot be done in the context of a worship service because then they become exclusive, and exclusivity is considered viewpoint discrimination worthy of a content-based restriction.

Another question considered by the court is whether or not the use of a government-owned forum as a venue for worship is, in any way, an endorsement of any particular religion. The court says that yes, someone may reasonably perceive the use of a government facility for hosting religious worship as an endorsement of that particular faith. But the application of law should not be entirely or even primarily determined by potential perception of the situation, but only by interpretation of the law, which must be done in a much more legalistic manner than the court has done here. To interpret the language, “shall make no law respecting an establishment of religion” to mean, “shall be privy to no religious worship practices” takes exceptional liberty in determining what the original language was seeking to convey.

Furthermore, the government takes no pains to exclude certain religious material from other government-owned or -endorsed materials, all of which may be perceived to be governmental support for the Christian or Jewish faith. United States currency, for example, proclaims, “In God We Trust,” which I would think a far greater endorsement of religion than the use of a public school since not all public schools host church services, nor do they all brand themselves as subscribing to any particular spiritual entity, yet all currency pieces contain this proclamation. We can also consider the Oath of Allegiance for those wishing to become U.S. Citizens, which ends with the phrase, “so help me, God.” This phrase is now optional, but the fact that it exists in the default suggests endorsement because it says that if you do not choose to exclude God from your oath, then the government will choose for you to include Him, indicating a preference. On the other hand, the government has not chosen for students and teachers to attend school on Sundays when Christian worship takes place, but only during the week when classes are in session, which means that the government is actually endorsing education, not the establishment of a religious ceremony which may be attended by people even outside of the school district, i.e., beyond the establishment of the government. Finally, the courts themselves still use Christian Bibles when swearing in witnesses. Whether or not this is an option for witnesses — I confess, I do not know — remains a moot point; it is the default, the choice of the government, and therefore may be reasonably perceived to be an endorsement of the Christian faith by the courts because it implies that swearing on the Christian holy book is more consequential than swearing on any other book.

The court, though, has dodged all of these criticisms by saying that it is not the expression of religion in government settings that is prohibited, but specifically religious worship. The argument, again, is that the activity of religious worship includes religious expression, but the expressions themselves do not necessarily constitute worship services, which are exclusive, and therefore might violate the Establishment Clause of the First Amendment. I should think, however, that most Christians would disagree with the idea that the forms of religious expression protected by the law do not constitute worship, and since it seems that “worship” more than “services” is what the government takes issue with, I think this is an important point. While I may not conduct a worship “service” in my home when I pray, prayer is most certainly an act of “worship,” as are the acts of singing hymns, reading and teaching Scripture, etc. I cannot say unequivocally for other religions, but it seems to me that any time a person removes him- or herself from the secular in order to be engaged in the religious, it is an act of worship, not simply an expression of faith. If I say that I believe in Jesus, I am expressing my religion, but once I begin to speak to Jesus, I am engaged in an act of worship, going far beyond a mere indication that I prefer Jesus to Mohammed, etc. Indeed, Dictionary.com offers one definition of “worship” simply as, “to feel an adoring reverence or regard for (any person or thing).” By this definition, the court’s decision suggests that any time I set foot on government property, the government may be perceived as endorsing a religion because I am always feeling an adoring reverence and regard for my Creator, and therefore perpetually engaged in an act of worship.

The court also said that schools are perfectly within their right to prohibit the conducting of worship activities because such prohibitions do not effectively impede the expression of love or reverence for those activities. The court uses martial arts and horseback riding, among others, as non-religious examples. These activities may be rightfully prohibited on government property because such prohibition seeks “the objective of avoiding either harm to persons or property, or liability, or a mess, which those activities may produce.” (Bronx Household of Faith v. NYC Board of Education) But then the practice of religious worship would have to threaten either persons or property, risk liability for said persons or property, or threaten to create more of a mess than, say, a school cafeteria during lunchtime. How to quantify such things is beyond me, but it seems reasonable to deduce that a religious worship service would in no way be more dangerous or messy than a typical day at a school full of children or adolescents, and while activities such as horseback riding are prone to significantly impair the forum before and after the event, church services are not. Further, the purpose of the forum — to educate children, which the Supreme Court found to also include moral instruction — is not impaired by a church’s activities conducted outside of the forum’s standard hours any more than it would be if a prom were held on school grounds despite the fact that some students and parents may morally disagree with the practice of dancing.

If it has been deemed permitted and protected by the U.S. Constitution for individuals or groups who are lawfully occupying government premises to express religious views on those premises, including the singing of hymns, the reading of and instruction in Scripture, acts of prayer, and other individual components of worship, then it is contradictory and even hypocritical to prohibit the actual worship service itself since it consists of no more than the individual components for which the government makes allowance. If the mere existence of a worship service in a school building can be seen as the government making a law to establish religion, then so should be seen the use of religious language on government money or in government courts when swearing in witnesses or receiving new citizens. If the issue at hand is only to avoid giving individuals the perception that the government has established a religion, then we cannot rely on the First Amendment at all because it doesn’t address the perceptions of the public, but only the actions of the government, and specifically, the legislation. Since no law was passed establishing a religion, yet a law threatens to be passed prohibiting a specific expression of religion, it seems logical to deduce that the government actually finds itself much closer to a First Amendment violation now than it was when churches were merely indulging in that same Amendment’s provision for the free expression of religion.

The freedoms of speech and religion were enacted to prohibit religious persecution that resulted in oppression and death, not to shield the people of this country from any viewpoint, religious or otherwise, that might be in contrast to their own traditional family values or personal opinion. Such a shield would only reinstate the very same idealistic tyranny that the early Americans first opposed and denounced, thus infringing upon, rather than furthering, the luxury of freedom we tend to thoughtlessly take for granted today.

Art in the White House

Do you read music? Of course you do; everyone does. Every time you listen to a piece of music, you are reading it. More accurately: you are reading into it, because there is no absolute music. If such music existed, it would be composed of sounds out of context, freed from associations with words, history, politics, or meanings. But music is always heard in context: listeners bring to it previous encounters, sensory memories, knowledge of its history or composer’s biography, and the weather, architecture, or society of the moment. No matter how abstract the composition—how divorced from text or images—there is never a moment when music is not interpreted. There are no purely musical pitches; all are flavored and scented with their passage through time and atmosphere. Nothing precedes interpretation.

If this is true, how much more so when the music is played in a political context (witness the recent Lang Lang debacle). Bach’s Mass in B minor, BWV 232, can serve as a little historical case study. This great composition (“great” is a reading; it was not always so valued) has an interesting history of tangled religion, economics, and politics. Bach wrote the “Kyrie” and “Gloria” in 1733 as a job application. He wanted to become court composer to Elector Friedrich August II—and sent these religious pieces (Sadie 799, Stolba 310). The letter accompanying these pieces is exquisite flattery:

To Your Royal Highness I submit in deepest devotion the present slight labor of that knowledge which I have achieved in musique, with the most wholly submissive prayer that Your Highness will look upon it with Most Gracious Eyes, according to Your Highness’s World-Famous Clemency and not according to the poor composition; and thus deign to take me under Your Most Mighty Protection. (Bach 128)

Apparently the Elector read this document favorably, because Bach got the job in 1736 and wrote some of his most dramatic work for this royal boss (Sadie 799). But Friedrich took him literally about the “poor composition.” This piece, along with almost everything else Bach wrote, was but little esteemed during his lifetime: “until about 1800 there was, in fact, almost nothing of the whole of Bach’s output in print” (Blume 30).

Why was Bach’s music, later to be ranked among the greatest in the “Western” world, relatively overlooked in his own time? According to one interpretation, because of politics: “the powerful rise of national consciousness in the period of the Napoleonic Wars … taught [German people] to see in Bach the prototype of the German spirit in music” (Blume 37). In 1802, one Johann Nikolaus Forkel wrote The Life, Art and Works of J. S. Bach. For patriotic admirers of genuine musical art. As if the title were not patriotic enough, he wrote: “Be proud of him, German fatherland… His works are an invaluable national patrimony with which no other nation has anything to be compared” (qtd. in Blume 38). Nationalism revived interest in the music of Bach; religion preserved it.

One religious denomination was almost single-handedly responsible for promoting Bach’s work in the “New World”: the Moravians. They took this quintessentially German work and translated it into an essential item in the American canon. In 1900, the Bethlehem Bach Choir of Pennsylvania gave the American premiere of the Mass in B Minor; eighteen years later, during their annual Bach festival, they also performed “The Star-Spangled Banner”: “Mrs. Theodore Roosevelt and other national figures” attended the concert (Bach.org). In 1925, “the Choir [was] chosen, as the most representative of America’s musical organizations, to perform The Mass in B Minor in an Easter Concert … in the new Washington, D.C. Auditorium. The Choir [was] invited aboard the Presidential yacht and to the White House to be greeted by President and Mrs. Calvin Coolidge. The Choir [was] photographed in front of the White House” (Bach.org). And thus, by a series of beautiful mis- and re-interpretations, the German Bach ends up in the White House as a representation of American art.

There are many ways American presidents and first ladies, like the Roosevelts and Coolidges, have patronized the arts. Some of these are unofficial, but public: attending performances, visiting galleries, screening films before release, reading and talking about recent books, hosting arts-celebrities at social events. Some of these are official: purchasing and commissioning works, presenting awards, funding and conferring grants, underwriting public radio and television (at least for now). The White House itself, “the nation’s oldest important showcase for the performing arts” (Kirk xiv), is a museum and concert hall.

How do we read this? When a piece of music is commissioned by a Republican President, is it a Republican piece? When a sculpture is purchased by a Democratic Congress, does it become a Democratic sculpture? When a painting is hung in the White House, is it propaganda? If it stays in the White House when the administration changes, does it switch parties? Does art comment on policies, laws, and wars, or does art inhabit a politics-free zone? The answer to all of these questions is Yes. Art in a political context is open to political interpretation: but politics (and politicians) are also open to artistic interpretations. Instead of reading the White House’s arts (visual and auditory) as political documents, why not read the sum total of an administration’s artistic acts as yet another work of art?

It is easy to read the chronological collection of paintings in the White House politically: mostly landscapes and portraits, they are poster children (or just posters?) for America’s beauty, size, diversity, and proverbial individualism (Kloss 14). For a long time, the quality of the paintings was of no concern; what mattered was that they presented idealized images of heroic presidents and vast panoramas. During the Kennedy administration, the collection came under professional supervision (Kloss 44, 46), but to this day “the collection remains unified by … art—as historical document, as decoration, and as vehicle for celebrating American values and achievements”; it is “documentation rather than art” (Kloss 23, 32).

But let’s turn that reading on its head. Instead of lamenting or criticizing the White House collection because it contains mediocrities, copies, and pastiches, why not imagine that the whole history of hanging art on its walls is a story—a well-planned narrative peopled by characters participating in an exciting aesthetic plot? They shape a nation; they act and write its legends; they picture it vividly; they appropriate the national composers of former enemies.

The current chapter of art-in-politics-as-art is no less interesting than previous ones. The present administration in Washington is well aware of the power of the arts. During his campaign, Obama’s image was broadcast on the iconic “HOPE” poster by notorious street artist Shepard Fairey. Capitalizing on the social energy of image, word, and music, the Obama White House has sponsored a remarkable variety of events, pre-packaged in speeches by the President and the First Lady. Interpreting their interpretation is an exciting exercise.

In their first year-and-a-half in the White House, Mr. and Mrs. Obama hosted a “White House Evening of Poetry, Music, and the Spoken Word,” a dance event, a Jazz studio, an evening of Broadway music, an evening of country music, a “Fiesta Latina,” a “White House Evening of Classical Music,” and “A Celebration of Music from the Civil Rights Movement.” Each event began with the President or the First Lady saying something like: “Today’s event exemplifies what I think the White House, the People’s House, should be about. This is a place to honor America’s past, celebrate its present and create its future.” Then would follow comments about how the art form in question was uniquely American—country music because it tells “stories that are quintessentially American,” jazz as “America’s indigenous art form,” Broadway as the favorite tunes of New York City, and Bach?

At the White House Evening of Classical Music on November 4th, 2009, concert pianist Awadagin Pratt performed “his own offbeat arrangement” of Bach’s organ Passacaglia and Fugue in C minor, BWV 582 (Washington Post). How is Bach American? Perhaps because Pratt is African-American? Because Americans appreciate high culture, and Bach is sophisticated? Because Americans are omnivorous, and Bach is just another yummy dish? Or just because Pratt ended the piece by tagging on a little “Hail to the Chief”? (How dare he mess with the sacred works of saint Johann?!) One way to read this is that our administration doesn’t even know good music when they [don’t] hear it. The Washington Post thinks “The day’s message was, ‘Look, classical music can be fun,’ even though this message is also a tacit admission of the widespread assumption that it isn’t” (Midgette). Really? Are any of those the message?

A little music history provides another reading: a “Passacaglia” is “a continuous variation form” (Randel 611). Pratt was following in the tradition; he was improvising variations on Bach’s material. Bach himself was not above a little—or a lot—of brown-nosing. His Musical Offering is “based on a theme given to Bach in 1747 for improvisation by that accomplished musical amateur, Frederick the Great of Prussia” (Lipman 220). The King wrote the (boring) theme, and Bach wrote variations on it to toady to the king. Now the pianist Pratt gets invited to play at the White House and throws in his own little bit of butter-up-the-President. And the President throws in jazz, country, and Broadway to butter up the people. Or to be one of the people. Or to bring the people in. Or to condescend to the people. It’s impossible to say which, because it’s interpretation all the way down. But when this administration leaves office, its collective artistic statements will remain: a creative act of its own for future interpretation.

Sources
Bach, J. S. “Bach Asks Frederick Augustus II for a Court Title.” Dresden, July 27, 1733. In David, Hans T. and Arthur Mendel, eds. The Bach Reader: A Life of Johann Sebastian Bach in Letters and Documents. Revised edition. NY: W. W. Norton, 1966. Print.
Blume, Friedrich. Two Centuries of Bach: An Account of Changing Tastes. Trans. Stanley Godman. London: Oxford UP, 1950. Originally published in 1947 by Bärenreiter-Verlag, Kassel, as Johann Sebastian Bach im Wandel der Geschichte. Print.
Hitchcock, H. Wiley. Music in the United States: A Historical Introduction. Englewood Cliffs, NJ: Prentice-Hall, 1989. Print.
Howard, John Tasker. Our American Music: A Comprehensive History from 1620 to the Present. 4th edition. NY: Thomas Y. Crowell Co, 1965. Print.
Kirk, Elise K. Music at the White House: A History of the American Spirit. U of IL Press, 1986. Print.
Kloss, William, Doreen Bolger, et al. Art in the White House: A Nation’s Pride. Washington, D. C.: White House Historical Association in cooperation with The National Geographic Society, 1992. Print.
Lipman, Samuel. Arguing for Music, Arguing for Culture: Essays. 1st ed. Boston: D.R. Godine in association with American Council for the Arts, 1990. Print.
Midgette, Anne. “Classical music has its day, albeit a muddled one, at the White House.” The Washington Post 5 Nov. 2009. Web. 10 Feb. 2011.
Palisca, Claude V. Baroque Music. Englewood Cliffs, NJ: Prentice-Hall, 1968. Print.
Randel, Don, ed. The New Harvard Dictionary of Music. Harvard UP, 1986. Print.
Sadie, Stanley, ed. The New Grove Dictionary of Music and Musicians. Vol. 1. London: Macmillan, 1980. Print.
Stolba, K. Marie. The Development of Western Music: A History. 3rd edition. Boston: McGraw-Hill, 1998. Print.
Walters, Raymond. The Bethlehem Bach Choir: An Historical and Interpretative Sketch. Houghton Mifflin Co., 1918. Google Books. Web. 10 Feb. 2011.
The White House: Office of the Press Secretary. Several press releases about White House events and transcripts of speeches from those events. www.whitehouse.gov. Web. 11 Feb. 2011.

Thinking Historically

For the most part, I have a pretty good life as a grad student. Aside from time spent in class (and that crazy month right before seminar papers are due), I spend most days happily immersed in the writings of great scholars and statesmen of generations past, which means living a pleasantly hermit-like existence either in a magisterially peaceful, wood-paneled reading room at Union Theological Seminary, or on my couch, curled up with a cat and a cup of coffee. But occasionally, I daydream about what else I could have done with my life — usually prompted by well-meaning jibes from relatives with MBAs who see no use for a PhD in English in the real world, or by a New York Times homepage bringing news of turmoil in the social, political, and economic spheres, or by the shock of the contrast between my placid Morningside Heights neighborhood and some of the sketchier areas of town. On these occasions, the Ivory Tower feels like exile. I wonder guiltily whether I buy my comfortable lifestyle at the expense of being uninvolved in the world around me, or whether I do so in order to avoid the responsibility of trying to improve it. Have I fled from the present world in order to escape into the past? And what account can I then give of myself to those who will inherit this world in the future?

In the past few months, I have found an unlikely model for thinking through these questions in Niccolò Machiavelli. I’m not sure how many other people in the world would admit that they want to follow Machiavelli’s lead in attempting to be socially engaged; his name is now literally a by-word for bloodthirsty back-dealing and two-faced politics. We hear “Machiavellian” and think of classic stage Machiavels — Barabas, killing off a whole nunnery in Marlowe’s The Jew of Malta, or Shakespeare’s Richard III, putting the “murd’rous Machiavel” to school in his blood-soaked grab for the English throne — or of the term’s occasional use by psychologists to describe people with anti-social tendencies. Nevertheless, Machiavelli has proven a good guide for me not because of his unscrupulousness in the political present, but because of his conscientiousness in approaching that present through the writers of the past.

While most people focus on the juicier catchphrases in The Prince that advocate ruthlessness and immoral political conniving, Machiavelli’s concern was not about morals as such at all. Rather, his lifework and writings were aimed at finding realistic ways of creating a stable, democratic political world. During the brief period in the early 1500s when Florence operated as a republic, Machiavelli worked as a civil servant — in the second chancery and as a secretary to the intimidatingly named “Ten of War” committee — before the Medici family returned, took control of the government, and created a virtual oligarchy. Under the new regime, Machiavelli was confined to within 25 miles of the city limits, then imprisoned as a potential threat, and strappadoed six times (the legal limit was four and, bless him, he still refused to confess to anything) before being released. Unemployed and still under suspicion from the government, he exiled himself from the centers of political and social power in Florence and moved back to his family’s farm just outside the city, where he wrote all of his major political works. [1]

While there, Machiavelli wrote a letter to Francesco Vettori, the Florentine ambassador to Rome, asking for advice on how to use The Prince to regain entry into the political sphere, and it’s the tensions in this letter that haunt me as I wrestle with what it means to be involved simultaneously with the world of the political present and with the thinkers of the past. In order to take his mind off his present troubles, he says:

When evening comes, I go back home, and go to my study. On the threshold I take off my work clothes, covered in mud and filth, and put on the clothes an ambassador would wear. Decently dressed, I enter the ancient courts of rulers who have long since died. There I am warmly welcomed, and I feed on the only food I find nourishing, and was born to savor. I am not ashamed to talk to them, and to ask them to explain their actions. And they, out of kindness, answer me. Four hours go by without my feeling any anxiety. I forget every worry. I am no longer afraid of poverty, or frightened of death. I live entirely through them.

After so many years of anxiety and uncertainty, Machiavelli’s approach to the past seems to be about turning to a quiet, contemplative life where one forgets “every worry” of the present by living vicariously through those he studies. It’s a picture of comfort and solace to be sure, but where does the comfort come from, and how does one square these claims with his evident desire to be back in the thick of things?

The key for me to the meaning of his studies can be found in his adherence to humanism and its mode of civic engagement. Machiavelli, as any 10th-grade European history teacher will tell you, was one of the subscribers to the new humanistic learning that gained traction across Renaissance Europe, and that sought to revive the study of ancient and classical literature and history. The humanists believed that education in the classics formed effective citizens who would take part in the world around them, and who would use the wisdom of the ancients to refashion their political present. In this worldview, it is only by taking stock and contemplating the past that one can lead an active public life. Far from a turn away from the problems of the contemporary world, Machiavelli’s classical studies can only be understood as a whole-hearted embrace of the present and its problems.

In light of this, the comfort and the solace of studying the past come not from escapism, but from the capacity it gives us to deal wisely with our own time. Machiavelli likens studying the ancients to entering a foreign court, and casts himself as an ambassador, comparing policies and gaining knowledge and information to take back to his own realm. Studying the past gives us perspective on the present; it allows us to see the scope and significance of contemporary sociopolitical events in light of the grand sweep of history, to identify historical patterns and similarities by which we can predict outcomes and respond appropriately, and to acknowledge our own transience in the recognition that, just as earlier dilemmas were superseded by later problems, so also will ours be engulfed by those of the future. As Hannah Arendt, another great political philosopher, wrote in the last century,

Life itself, limited by birth and death, is a boundary affair in that my worldly existence always forces me to take account of a past when I was not yet and a future when I shall be no more. Here the point is that whenever I transcend the limits of my own life span and begin to reflect on this past, judging it, and this future, forming projects of the will, thinking ceases to be a politically marginal activity. (Arendt, The Life of the Mind 413)

It certainly worked for Machiavelli. The immediacy of the republican crisis in Florence is now all but forgotten, but the judgments Machiavelli formed about it by studying the past have survived and continue to influence the way we think about politics in the European and American republican traditions, providing a means for us to conceptualize how politics work, now and in the future. And so with the image of Machiavelli before me, decently dressed and nourished by the ancients, I continue to study — not as a hermit, but an ambassador, reaching back to the past to get wisdom for the present.


[1] David Wootton, Machiavelli: Selected Political Writings, Indianapolis: Hackett Publishing, 1994.

Is Laughter the Best Medicine?

It dawned on me the other day that I don’t really listen to the news, I laugh at it. I go out of my way to revel in the absurdity of the 24-hour news cycle and the stupidity of celebrities and politicians whom I wouldn’t give two cents about except for the fact that I can laugh at them. Other than listening to NPR in the mornings, the four main sources of news in my life are The Daily Show, The Colbert Report, Wait Wait…Don’t Tell Me, and the Relevant Magazine podcast. All of those shows work hard to make fun of the news and point out the absurdity and irony of our media-driven culture.

Laughing at reality is big business. Jon Stewart and Stephen Colbert are cultural icons in their own right, and the radio program Wait Wait…Don’t Tell Me attracts nearly three million listeners a week. It’s such big business that Fox News even made an ill-fated attempt at a news satire show à la The Daily Show with a conservative slant, The Half-Hour News Hour (it aired seventeen episodes before getting axed).

The old adage is that laughter is the best medicine, and some kind of medicine is needed in a world where one of the biggest TV shows in the country follows the misadventures of a bunch of beach-loving, hard-partying Italian-Americans stuck in arrested development. Laughter seems to take the edge off the world for those who want to cut through the mass-marketing of plastic trinkets and overwhelming cultural garbage of reality shows and politicians scheming about spending billions (or trillions). Laughing at the news every few days is an act of cultural catharsis, removing the stench of our world’s stupidity by laughing it away. But is a medicine that causes you to purge and forget, to laugh the world’s wrongs away the way you laugh off a fall from a chair, really the best medicine?

Cleansing oneself when dirty or sick is often necessary, but to do it too much — to turn a medicine into an addiction — is to begin to treat laughter as a coping mechanism: it gets you by until you need your next hit of laughter. There has been a lot of valid talk lately about the vitriol in our country’s political conversation. People are using anger to cope, and the anger of Glenn Beck or Keith Olbermann feeds people like a drug. It makes people want to change the world by raging against the machine, breaking stuff, or forming militias. Laughter seems like the better option when compared to anger, but I’ve begun to find that all the laughter is making me apathetic. Instead of being mad at the world, I’m being conditioned to think the world is one big Camus novel and there’s really no point to the madness, so just laugh.

There is a way out of the mire, though, and it’s by taking a long view of culture.  Our fascination with the present compounds our tendency to react with anger or laughter. We crave real-time information from Twitter, 24-hour cable news, and our smartphones. When inundated with information, we focus on the present so much that we forget the harsh fact that what we’re arguing about is probably not even going to be significant in a year or two. The world only appears to be a big Camus novel if we never look to the future with any sense of calm or hope. If we take a breath and think about the long-term impact of what’s going on in the world, what really matters in life will rise to the top, and that certainly won’t be contemporary political drama or the latest celebrity gossip. Our human nature drives us to instantly react, but deciding to take a more rational approach, to not become angry or apathetic, but to live a quiet and peaceable life, gives us a more grounded outlook on the world.

Laughter has its place, and I’m not going to stop watching Jon Stewart or Stephen Colbert, but what’s more important is to begin to cultivate a view of the world that is not stuck in the ever-changing present. It’s just too much to keep up with, and the disequilibrium caused by all that information is what provokes anger and apathy in the first place. The corruption, evil, suffering, and stupidity of this world are enough to make us grow mad at the world or shut down rather quickly.  Instead, when anger begins to simmer or apathy begins to choke our desire for a new and better world, we should remind ourselves that there is truly nothing new under the sun. The world will always have problems. We can choose to get angry about it. We can choose to laugh at it. But the world would be a far better place if every once in a while we got up off the sofa or put down the picket sign and did something about it in a selfless and radical way.

Cruel and Usual


Ever since the institution began, and certainly since the 1970s, the American death penalty has been an object of relentless scrutiny in the criminal justice system of the West. Europe is appalled that we still have it. The Middle East is appalled that we don’t use it more frequently. In some states it’s non-existent, in others it’s little more than a myth, and there are still some that can’t seem to get enough of it. (Yes, I’m looking at you, Texas.) So the debate will go on until the unlikely day when the federal government abolishes executions altogether.

Yet even while the fires of the capital punishment debate show no signs of cooling, a recent Supreme Court ruling has sparked a new debate, rooted in the same constitutional criticism as execution-abolition. With executions on the decline while recidivism has been inching its way up the charts for nearly three decades, those lovable lefties have taken up the faithful arms of that pesky Eighth Amendment once more in order to propel the next Great Debate: life imprisonment for minors.

The Eighth Amendment states, “Excessive bail shall not be required, nor excessive fines imposed, nor cruel and unusual punishments inflicted.” That’s it. Sixteen simple, highly interpretable words, upon which universalists, liberals, and abolitionists have stood tall and proud on ethical, moral, and political soapboxes to proclaim all that is wrong with the punitive branch of our justice system, particularly when it comes to the death penalty. For some, execution of any sort is seen as cruel and unusual, though it is, ironically, one of the most consistent forms of punishment throughout history, which surely excludes it from being unusual. Then there are the conditionalists who insist that only some forms of execution are cruel and unusual, as though we might be able to convince the condemned — or even ourselves — that we really do care for their well-being if we poison them instead of bludgeoning them to death; firing squads are mean, but hanging is okay; gas chambers leave a bad political aftertaste, but electrocution gets a majority thumbs-up. There are also the legalists who rightly point out that the certainty of someone’s guilt is rarely substantial enough to take his or her life — perhaps the most tolerable and certainly the most logical of the arguments. And then at the farthest liberal end, the place where idealism trumps truth, there are those whose only wobbly leg to stand on is the one that says everyone deserves a second chance. But while an unstable footing may be enough to prop up the Eighth Amendment against death, it only touts social idealism and naivety when positioned against the argument of life in prison.

The case highlighted here is that of Graham v. Florida in which Terrence Graham, a minor at the time, was given a plea deal to avoid a guilty judgment in an alleged armed robbery. One of the terms of that deal was a probationary period, which he allegedly violated, sending him back to court for adjudication for the original robbery. At that time he was found guilty and sentenced to life in prison.

The argument, which the Supreme Court upheld, was rooted in the Eighth Amendment’s prohibition of cruel and unusual punishment, and cites a series of other cases in both recent and not-so-recent history which have set precedent to define what is “cruel and unusual.” Without getting into the nitty-gritty, the Court’s majority opinion is summed up by Justice Kennedy, who argues that a minor should have an opportunity to change. He writes, “Life in prison without the possibility of parole gives no chance for fulfillment outside prison walls, no chance for reconciliation with society, no hope.” This, he says, makes the punishment both cruel and unusual.

But Justice Kennedy is operating on an idealist principle which says that the prison system is designed for reform, rather than on the truth, which is that prison has much more to do with punishment. For years the criminal justice system has been trumpeting to the media about incarceration’s rehabilitative qualities: how it shouldn’t be seen as an entirely punitive measure, how there is much more to it than locking them up and throwing away the key. Sadly, Justice Kennedy, a would-be conservative who can’t seem to stop drinking the liberal draught, has enthusiastically pledged to sing along.

The truth is, though, that no matter how many educational programs, social workers, religious institutions, or other rehabilitative measures are put into place within prison walls, the system itself will continue to keep itself in business as long as it continues to put the problem children together on the playground without supervision. Indeed, such a metaphor breeds a sense of irony because it is exactly in the school system where we see a similar sociological phenomenon. Take children even from well-to-do families and put them in the best educational institutions around, but the ones who have a penchant for trouble will not only find it, but they will find each other, and from these associations they will often go on to break more rules than they would have had they never met.

Prison is exponentially worse because it only houses the troublesome ones; strictly speaking, there are no “good” social influences. There is frequently street or even gang mentality in prison: demand respect by instilling fear even if it means resorting to violence; the weak will cling to the strong in order to protect themselves, and any opposition perpetually risks life and limb.

Even outside of violence, in the regular day-to-day of prison life, social interactions will, even if innocently at first, veer down the wrong path. Inmates will surely make small talk as humans are wont to do, except that unlike the world outside prison walls, no one is going to start a conversation with, “So, what do you do for work?” Clearly, nothing anymore. The more natural icebreaker becomes the Hollywood favorite: “So, what are you in for?”

I bring up the obvious to point out the subtle: inmates frequently talk about crime. For a few, it’s all they know. And given the choice between slowly muddling through high school equivalent education or anger management courses, teaching inmates theories with little hope of opportunity for application, or learning from one another about how to get further, faster, the majority tend to sway towards the latter, thus perpetuating the very criminal mentality the system claims to be reforming. So when the “second chance” comes around, ex-offenders become re-offenders, recidivism rates hover at a staggering two-thirds for re-arrest and fifty percent for re-incarceration (so much for rehabilitation), and criminals find themselves right back in the over-crowded system that has already failed them once.

Herein lies the true violation of the Eighth Amendment. To merely prohibit life imprisonment for a minor only looks good politically. But practically, when that minor is released from prison in twenty or even ten years, he’s still going to have a long, uncertain — and yes, frightening — road ahead of him. He has learned only how to function in a unique population subset with no real understanding of how the world outside works. (Think how much society changes in ten years, let alone twenty or more.) To send him back out into that now-unknown world with fifty dollars, no identification, and a list of homeless shelters to be turned away from is far crueler (though I’m afraid not very unusual) than to keep him in prison for the rest of his life. Even in the cases of ex-offenders being released to family and friends, to do so with no further guidance than a weekly tussle with parole officers (who are oftentimes ill-equipped themselves to deal with the trials of the parolee’s societal reintegration) is to set them up for failure. Well-intentioned as family and friends often are, they are just as often unable to shoulder the burden reintegration presents, and perhaps more often become part of the problem.

In fairness, it isn’t the High Court’s job to create new laws, only to uphold or strike down the rulings of lower courts. But as long as legal precedent will be the result of the Court’s decision, it would behoove the system to take further action. If Justice Kennedy and his liberal cronies want to make a real difference in the justice system, they should have a few conversations with their buddies in the legislature about how we can provide the rehabilitative services offered in prison post-incarceration, rather than piously denouncing one punishment as unconstitutional while the alternative is hardly better and possibly worse. With the billions of dollars the federal government pumps into policies governing education for those who already have it, money for those who should share more of it, and wars that should be winding down instead of revving up, surely there can be some reallocation towards reintegration, among other things. Then, and only then, will we be able to adhere to the principles and intentions of the Eighth Amendment while simultaneously moving one step closer to providing some of those in need with a second chance that may actually have the substance to bear the fruit the system presently pretends to grow.

The Pros of Disputation

If you are standing in the school cafeteria with someone on the debate team and you make some ridiculously subjective statement like, “I really feel like eating roast beef today,” the debater will clutch you by your windpipe and cite evidence – using rhetorical markers like Point 1 and sub-point A – saying that according to the FDA’s dietary guidelines from 2005, roast chicken is healthier than roast beef besides being more tender and a better complement to the soggy green beans on the menu. The debater will demand that you rebut him on the spot. If you say simply, “Um, I like roast beef and you can like chicken,” it will utterly befuddle him.

That’s because debate is arguing for the pure joy, passion and zeal of argumentation. Clearly, this draws a certain kind of personality. To a true, cross-ex-in-the-bones debater, disputation is simply conversation. The activity of competitive debate is good for him and all of us. When he unleashes his innate disputatiousness in the school cafeteria, he’s annoying. But in the tightly structured forum of debate where his speeches have time limits and there are moments he has to shut up, that orneriness is tamed and polished. He becomes the kind of arguer the world could use more of.

To promote his memoir Wisenheimer: A Childhood Subject to Debate, Mark Oppenheimer held a debate at his alma mater, Regis High School in New York City. Oppenheimer and a high school senior debater, Joseph Eddy, faced off against journalist Hanna Rosin and Stuyvesant High School debater Claire Littlefield, debating the resolution, “Is American political dialogue in trouble?”

I walked into the room completely prejudiced. Clicking Twitter headlines from my couch, I cringe at the outrageous statements of entire swaths of the American population. How, I scoffed, could anyone NOT believe that American political dialogue is in trouble?

But as the debaters stood and gave their opening arguments, I found myself falling back into an old pattern from my debate days: the tabula rasa. Your mind becomes a blank slate – all of your preconceptions and prejudices erased, wiped clean, ready to judge each argument on its merit and evidence.

The men, affirming that American dialogue was in trouble, gave their case, and the women argued the negative against it. They sparred in cross-examinations and gave rebuttals. As Rosin and Littlefield spoke, I found myself switching allegiances. The aff was so obvious, the neg more nimble. The neg had a harder premise to prove, and they were creative and spirited in the way they constructed their case; proving their side took more finesse.

Besides that, I almost always feel inclined to pull for female debaters over male. Being a female debater takes poise and a willingness to go toe-to-toe with men who are bigger than you. When I was a 98-pound little-voiced slip of a thing, I debated oafs who were six feet tall. Their voices commanded the room and their figures dwarfed mine when we got up for cross-ex.  Female debaters face what female candidates and female CEOs do: the perennial prejudice that audiences will view spirit as shrillness and confidence as masculinity. Female debaters must root for one another to succeed.

And yet as the debate progressed and I followed each argument, charting its flow (yes, we call it a “flow chart” in debate terminology), the debate came down to a single question for me. Please remember that I am paraphrasing and both sides might quibble that I’m omitting certain nuances, but the gist of the argument was this:

The women were arguing that the fringe elements of the American dialogue – the Glenn Becks and Rush Limbaughs and Keith Olbermanns – were actually good for American dialogue. They admitted the fringe was fringe but said that despite all that, the fringe was a vital part of American dialogue and denying them their distasteful opinions would be un-American.

The aff argued that this made no sense. If American dialogue is dominated by extremists who care about partisan politics over truth, then how could this possibly be a good thing? If the neg conceded the premise that wingnuts dominated the debate, could they still argue that American political dialogue was just fine? If they admitted extremists were loudest, did they have an argument for why it was good that extremists were loud? I went from one side to the other and back again as I watched the debate. And finally, the question came down not to my personal prejudices but to the question, “Who made the argument the other side didn’t answer fully?” I didn’t hear a satisfying argument for the virtues of wingnuts. If I had been casting a ballot I would have said yes, the American political dialogue is in trouble.

It would be in less trouble if leaders had to debate like those students did. In high school debate, you take turns debating each side of an issue. Imagine if Congress had to abide by the rules of high school debate. Imagine if we presented them with the health care bill and told them they had to switch sides. The Democrats had to come up with the best, most sophisticated arguments against health care and the Republicans had to passionately defend the health care bill against all arguments against it. Whoever won the debate got to choose the outcome of the bill. If Republicans won, they could kill the bill; if Democrats won, it lived.

We would have a different kind of debate, I imagine. For one thing, it would be more objective. Each side would be arguing for the position they abhorred, but for the sake of winning for the side they loved. It would cool the heat that comes not from facts but from resentment and fear and pandering. The debate would be about evidence and arguments, not about how you feel.

Doing this helps you see the other side. It helps you see where your own arguments are weakest when you find yourself skewering the argument you believe and questioning the evidence you personally find credible. You discard arguments that are bad and find evidence that has weight.

Afterwards it was said that if students like Eddy and Littlefield were America’s political future, then American dialogue would one day prosper. It was true.

Go Ahead, Change My Mind

Over the last couple of weeks I’ve been teaching arguing, or persuasive, essays in my freshmen composition courses. I save this format, along with evaluating essays, for last because in some sense they utilize all the skills that the students have picked up through the practice of writing, remembering, observing, and explaining essays in the previous weeks and months.

That’s one reason why I save these types for last, and it’s the better reason. The other reason is that I know from experience that it’s hard to keep everybody’s attention – the students’ and my own – focused in the waning weeks of a semester, particularly a spring semester when the weather is warming and summer vacation is on the horizon. Arguing and evaluating essays are my favorite types to read and write, and often my students’ favorites, too.

I assign several readings for each essay form, chosen because they are prime examples of the particular type. Though the selections often change, Martin Luther King, Jr.’s “Letter from Birmingham Jail” has a permanent home in the arguing section of my syllabus. For those readers who have not sat through a semester of class with me (or any of the countless other composition courses that utilize King’s classic text): it is an epistle written while King was in the city jail in Birmingham, Alabama, after being arrested for taking part in non-violent protests there. The letter is a response to “A Call for Unity,” a statement published by eight white Alabama clergymen in which they conceded that injustices were taking place but argued that protest, even non-violent protest, was not appropriate and that proper, legal means should be pursued.

King states his case in no fewer than nine points (he even apologizes at the end for writing so much and notes that there’s not much else to do when one is in prison). There is no doubt in my mind that this is one of the greatest arguing essays ever written, offering some of the most airtight arguments ever made. The now-famous line, “Injustice anywhere is a threat to justice everywhere,” appears in this text.

What I have never known about this essay, and still to this day cannot say for certain, is what effect this text had first on its intended audience, the signatories of “A Call for Unity.” I know that every time I read it I get chills and that most of my students come to venerate it, but as far as I can tell from my admittedly very limited research on the topic (Google “Letter from Birmingham Jail” and, most unfortunately, over 95% of the results are for free term papers on the essay) there is not much written about whether or not it “worked.”

Now, of course, to a certain extent it did work, as did MLK’s social action, speeches, and, sadly, his death, in addition to the work of countless other civil rights advocates. But whenever I think specifically about the impact of “Letter from Birmingham Jail,” I can’t help but wonder if it actually changed any of the clergymen’s minds, or had a life-altering effect on any residents of Birmingham.

And I wonder about this in regard to many arguments I hear made, debates I witness, and apologists I read. Do all these words, all this time spent building a case, ever actually work to convince somebody that the position that they hold is wrong and that they should exchange it for another, more correct stance?

And yet, I know that people do change their minds. I don’t know how I would describe myself politically prior to 2001, but I know that whatever it was (must’ve been somewhere between far right and right of center, as that’s where my parents, church, and educators were coming from), by 2002, my views were very different from those of the people that had an influence on me in my youth. I can point to a few definitive books I read (Franny and Zooey, On the Road . . . yeah, I know), and some very important people I met, conversations I had, and things I experienced (studying in Kenya was big in this regard), but I can’t point to any one thing as the straw that broke the camel’s back.

I’ve come to realize that, for the most part, I’m of two minds on this. On the one hand, semester after semester I make my students read and write arguing essays. And then I evaluate them on the clarity of their writing, certainly, but also on their ability to form a cohesive point and defend it. I teach them about three different kinds of arguments: traditional (I’m right, you’re wrong), constructive (I’m right, and I want to help you see why you’re wrong), and Rogerian (I’m right, but you may also be right, let’s compromise), but more often than not they choose the traditional style. And I’ve had some good writers over the years, but not once has one of them convinced me of anything I didn’t already believe. Nobody wins the argument, yet I still make each student do it.

Why do I continue to believe that learning how to make an effective argument is important when I really think it’s not a good argument that changes a person’s mind but a series of events, experiences, and lessons learned? Perhaps some of those influential books, essays, and stories that I read back in 2002 were argumentative in nature, and certainly my views grew more nuanced and I became more certain of what I was coming to believe through arguments, but not to the point where I feel comfortable saying an argument changed my mind.

It’s a scary thing to change one’s mind, to admit that the beliefs and values one clings to may not be as deeply held as once thought. And for a person as often prideful as I am, it is also a deeply humbling experience to reevaluate and to be found wrong. I know this is the case not simply based on 2002, or even on any of the hundreds of minor changes and course corrections that I’ve made in the years since, but because I fear it may be happening again.

I’m halfway through famed (infamous?) evangelical author Brian D. McLaren’s latest offering, A New Kind of Christianity. The book has attracted a lot of attention, mostly because of the overwhelming wave of extremely negative reviews it is garnering from other evangelicals. I’ve read McLaren before but, based on some of the commentary I’d heard about this book, even I approached it with some trepidation, with a bit of fear that he may have gone too far.

McLaren’s book is an argument, an apologetic. If I had to classify it for my class I would say it’s somewhere between a traditional and a constructive argument. The details of his case for a new kind of Christianity are the subject for a different sort of essay, but suffice it to say, his argument is made well. His points are clear and rational and, most importantly in an arguing essay, he appeals to what the reader may have already thought or believed though may never have given voice to.

This is a tactic I encourage my students to use, one that Martin Luther King, Jr. used masterfully. It involves knowing your audience and making an appeal to them that is both respectful and transformative. McLaren knows me. Like King knew his fellow clergymen, McLaren knows his left-leaning evangelical.

I can’t say where this will all end up, or where I’ll be when the pieces land. But I can say that I’m beginning to believe more fully in the power of the arguing essay. I can say that I’m beginning to change my mind.


The Message is the T-Shirt

When I was very young, my aunt and uncle gave me a ponderous elephant pendant necklace on a heavy silver chain. It was a necklace befitting a 45-year-old portly professional – the kind of necklace that would go well with an expansive plush suit in a matronly hue.

“We thought of you when we saw this,” they said. I looked at it and said, “Thanks.”

They said this because I made everyone think about elephants. I brought elephants to mind because I wore shirts that said “W. for President” and had a red-white-and-blue George W. Bush campaign tote bag. I had a collection of small elephant figurines because elderly relatives kept buying them and saying, “We saw this and thought of you.” I was the kind of child who walked her precinct during Republican primaries and attended state Republican party conventions on weekends. I woke up at 8:00 on Saturday morning to attend county GOP meetings. I was accompanied to these meetings by frail old Republican women who wore tapestry suits woven with elephant patterns and dangly elephant earrings. By anyone’s account it was my destiny to one day become a frail old Republican woman in an elephant-patterned suit, in which case the pendulous necklace would serve my wardrobe well.

I did not become that woman, but I have never – even in seasons of political ambivalence – stopped wearing political t-shirts. When a friend of mine said the other day that she would have nowhere to wear a political t-shirt, it startled me. To me, the only wrong place to wear a political t-shirt is church.

In 2008 I was, for the first time, an undecided voter. Never mind the journey that took me from George W. Bush tote bags to a crisis of political faith, but for the first time I felt myself pulled in two different directions. At first I decided not to vote at all, just for the principle of the thing – because it seemed unfair that I should have to choose between so many principles I held equally. But then one bright Sunday I walked through Union Square, which was brimming with campaign regalia from New York’s hippest artists. I could have bought twenty fashionista political t-shirts but my eye lit upon a light blue one with darker blue lettering that said “Blondes for …” Well, I’ll let you guess.


It was perfect. It said, “I am a blonde and I am my own special interest group, like lesbian Latinas or gun-toting Irishmen. This is my vote and while I am confident enough to advertise my vote on my boobs, there is a part of me that realizes if I have chosen wrongly it won’t be the end of the world; but still I am actually making my choice.”

Or maybe I just thought it was cute and I wanted to buy it as a companion for my “Blondes not Bombs” t-shirt. But I bought it – and the moment of buying the t-shirt and the moment of final decision were almost one and the same. My friend said, “Well I guess you’ve made up your mind then.” And I realized I had.

I wear political t-shirts both to make friends and make enemies. It’s my way of stubbornly standing up for myself when I feel stifled, and finding out who’s standing with me. I bought a t-shirt from Brooklyn Industries that showed Sarah Palin crowning a beatifically smiling Hillary Clinton Miss America. The artistry was ambiguous. (Hillary Clinton was hotter on the t-shirt than she was in real life). The message was somewhat ambiguous, too: Was Sarah Palin crowning Hillary Clinton the next woman in the White House because Palin had already won the White House? Or was Palin ceding First Female President to Hillary Clinton? I gave it my own interpretation. I bought it, loved it, got into arguments over it and lost it when I went to a primarily Republican wedding in Ohio – a memory that still makes me bitter as I search eBay for a replacement I have not yet been able to find.

Sometimes I like to buy my t-shirts a little to the left or right of where I actually am. My latest acquisition is a little pink vintage number that says, “Vote Democrat: A clean sweep.” I am not a Democrat, but I wear it to be a little perverse when I meet up with friends who campaign for Scott Brown. I want a t-shirt with a Jimmy Carter slogan of a grinning peanut, but Jimmy Carter is so lame that I’m torn. Perhaps a McGovern t-shirt with a dove of peace instead: obscure enough that pretty much no one will get it but relevant to today.

I wore that “Blondes” shirt right up until and on Election Day. Campaigners loved it. Elderly black women loved it. A boy staggering drunkenly through the West Village on Election Night also loved it. It’s ratty now but I still wear it to the gym, where nobody comments on it anymore. The big 2008 moment has passed. The hope is all tired and worn out – like my shirt – and no one will care to wear political t-shirts until 2012. Except me.

Whatever Happened to Due Process?

There was a time in my life when I regularly exercised a very reckless lack of judgment. During that time, I decided that the most satisfying future I could pursue would be in the world of law. Since I was transferring schools anyway—more reckless judgment—I jumped at the opportunity to change majors as well. Armed with a stubborn persistence and what I interpreted to be omniscience, I set off to change the world through the fisheye lens of the criminal justice system.

As it turns out, cynical people like me don’t find much of a welcome in the justice system. (I know. I was surprised, too.) But as I took my first steps into the world of justice, I found it difficult to be any other way. How was it that the United States of America, arguably at the helm of the greatest justice system in the world, could still see so much corruption, so much frivolity? How were men and women dodging murder verdicts based on trial technicalities while I couldn’t even get out of a speeding ticket? Something had gone unquestionably awry.

Yet much of the corruption is subtler than it seems. Indeed, while some legislation has evolved into ludicrous formality, there is no doubt that it once had roots in the protection of human rights.

Take, for example, the Constitutional right to due process. Any person tried on American soil is entitled to a trial and cannot be deprived of “life, liberty or property without due process of law.” But entitlement should not be the same thing as requirement, and due process of law should not mean fruitless formalities, both of which the State of Arkansas ought to consider in the case of Abdul Hakim Muhammad. Muhammad is accused of killing one military private and injuring another in a shooting at a recruiting center in Little Rock, Arkansas, last June. Once in custody, Muhammad said that he wanted to plead guilty, citing religious reasons for his actions. Arkansas, however, does not allow a suspect to plead guilty to a capital crime. Informed of this, Muhammad thought he was being led astray and most recently wrote a letter to the judge, bypassing his attorneys, stating that he is guilty, he wants no trial, and he stands by his actions as an act of jihad. For now, Muhammad must still plead not guilty and be tried for his crimes.

The overarching theme of what’s going on here is that the justice system—driven by an increasingly corrupt world of politics—is focusing less on discovering truth and serving justice and more on the political and social ramifications of its actions; in other words, the system is now riddled with laws that function more as clauses to cover the State than as statutes to protect human rights.

Forcing Muhammad through a trial brings up a number of major concerns, not least of which is that he could very well be found not guilty. The logical man would say that doesn’t seem possible, but the justice system no longer operates on logic but on politics, and in this case, politics says that there may be technicalities in the time leading up to trial by which a jury legally cannot convict Muhammad. These technicalities once acted as protections for human rights, but they have been corrupted largely by idealist defense attorneys who treat legal proceedings like a philosophy class where semantics hold more sway than truth. To people such as these, criminal justice is comparable to a high school debate team, and the result is that some criminals who deserve to be punished are walking streets that the American people otherwise believe to be safe.

Admittedly, though, if Muhammad goes to trial, he likely will be found guilty, which brings up a different concern: tax dollars. If it isn’t already bad enough that the people of any state have to foot the housing bill for convicted felons, it is downright unthinkable to require the people to pay for a trial for a man who is happy to confess, entirely apart from duress, and accept the penalty. The cost of a pointless trial on top of the cost of even a single year of holding a convicted felon in a maximum security prison or death row tops out at hundreds of thousands of dollars. At the risk of sounding callous, why spend more than we have to?

The root of the answer probably comes from a deep history of coerced confessions and botched trials. With DNA evidence rescuing hundreds if not thousands of people from life and death sentences, cost considerations carry less weight as mitigating factors when the possibility of an innocent man paying for a crime he didn’t commit remains. But prohibiting a guilty plea outright can only be relevant when there remains a very distinct question of truth, such as when a man confesses but then maintains his innocence later on, as in the tragic case of Amanda Knox. But while Knox’s confession may have been coerced and was certainly retracted, Muhammad has all but boasted of his guilt, and he’s continued to do so for seven months. Under such circumstances, it seems that Arkansas’ law needs a bit of tweaking.

The law, however, is unlikely to be tweaked, because whether it’s wasting resources or truly saving innocent lives, it covers the government’s back, which seems the more likely interest of the State in the first place. Rejecting a guilty plea in a capital case proclaims that Arkansas will not see any man martyred, whether for religious reasons or otherwise. Rejecting a guilty plea from a self-professed Muslim extremist tells the world that America gives even radical religious zealots a fair shot. There’s no religious bigotry here, no animal bloodletting. Just good, clean, criminal justice protocol.

The truth is that the protocol is not clean and it has very little to do with justice. States are endlessly embattled in a similar struggle when it comes to the death penalty, as the states that still execute inmates seek to prove to opponents that there is somehow a method of taking another man’s life which doesn’t amount to cruel and unusual punishment. Granted, while the Constitution remains as interpretable as the Bible, there are certain words and phrases in the Bill of Rights that simply don’t leave much room for debate. Killing a man is, by its very nature, cruel and unusual. Waiving your right to a trial is, by its very nature, due process of law, as long as the accused has been given the right in the first place. Muhammad obviously was, and if he’d rather not be tried, if he’d rather simply confess and go to the gallows, then it should not be the burden of the people to see him through the system simply so the State of Arkansas can boast a clean conscience.

The criminal justice system was designed to discover truth. Instead, it has become a place of political struggle where too many lawyers care too much about the game, too many judges care too much about appointments, and too many governors and legislators are more interested in appearing compassionate when the system they work for is still based in punishment, not rehabilitation. If Americans want to be the nice guys then we should do away with prison altogether and find a way to help offenders become functioning members of society, not force unwanted trials on criminal suspects, burdening already-strained American citizens in the process. Those who make their beds with determination to lie in them should be allowed to do so. The criminal justice system has plenty of other problems to deal with.

Read My Pins

Ask Sarah Palin after everyone learned she spent $150,000 on clothes. Ask Cindy McCain after the media slammed her for wearing an outfit that totaled $300,000. Ask Hillary Clinton when the media needled her for restlessly changing her hair style again and again. Ask Michelle Obama after Robin Givhan gushed that when Obama “bounded onto the stage in her sleeveless dresses, with her muscular post-Title IX arms in full view, the definition of a strong woman changed.”

It matters what a woman in politics wears. Every sartorial choice has significance, painting a political woman as shallow or thoughtless or mannish or callous or strong or rebellious or docile. But while women in politics should know all of this, the choices they make are often either thoughtless (wearing a $300,000 outfit while your husband tries to brand his opponent as an elitist) or carefully constrained by the rigid roles the country expects them to play.

The messages are either clumsy and wrong or so subtle that people find in them what they will. For instance, after XX Factor’s Hanna Rosin saw a picture of Obama delicately plying a shovel while wearing a long belted sweater and stylish boots, Rosin said, “I’m beginning to think Michelle rebels against the strictures of first lady life silently, through her outfits, the sartorial equivalents of a middle finger.” A fashion analyst wondered if Obama wore her famous purple sheath dress at the convention because purple is a mix of red and blue. But who knows?

However, a display at the Museum of Arts and Design shows a leader who walks a bold but dainty path in female political fashion. Her choices are bold, and the messages they send are clear, but they’re also whimsical and utterly feminine.

Madeleine Albright, the first female Secretary of State, was famous for using her collection of costume jewelry pins to send gentle diplomatic prods. The display shows over 200 of her signature pins. Some of them are delicate, but most of them are ponderous, the kind of pins you would need a very serious, sober suit to sustain, and so big that they tore holes in Albright’s serious suits, which she then covered with larger pins.

She began using pins to send messages after the state-run Iraqi press called her a serpent in a cunningly titled poem, “To Madeleine Albright, Without Greetings.” It went something like this: “Albright, Albright, all right, all right, you are the worst in this night.” The writer went on to weave in a menagerie of animal imagery, penning, “Albright, no one can block the road to Jerusalem with a frigate, a ghost, or an elephant” and calling Albright an “unparalleled serpent.” The next time she went to Iraq she wore a jeweled serpent entwined around a stick, with a diamond hanging down for its tongue. When the media asked her why, she said it was because the Iraqis thought she was a snake.

She wore wasps when she wanted to send a message with a bit of a sting. She wore a jeweled bug, made of amethyst, chalcedony and gold, to Russia after a Russian official bugged the State Department. She used turtles to complain about the slow progress of peace. She wore balloons to symbolize satisfaction when talks were going well.

When she met with Vladimir Putin and wanted to send a message about Russia ignoring human rights violations in Chechnya, she wore three chubby, Buddha-like monkeys miming “hear no evil,” “see no evil,” “speak no evil.” When she met to navigate talks about nuclear arms, she wore an abstract representation of an arrow, made of anodized aluminum. A diplomat looked down at her pin and said, “Is this one of your interceptor missiles?” She told him, “Yes, and as you can see, we know how to make them very small. So you’d better be ready to negotiate.”

She seems to wear them with a knowledge too many women in politics forget: the knowledge that she’s a woman and not a man, and that any disadvantages to being a woman are best deflected with a sly sense of humor rather than by acting like a man. A foreign minister mistakenly told reporters that he enjoyed hugging Albright because of her “firm breasts.” Of course outrage followed, which Albright deflected when reporters asked what she thought and she quipped, “Well, I’ve got to have somewhere to put those pins.” Then, of course, she bought a red fox pin, for when she was feeling flirty, to commemorate the occasion.

She told Newsweek, “I love being a woman and I was not one of these women who rose through professional life by wearing men’s clothes or looking masculine. I loved wearing bright colors and being who I am.” It’s an intentional, dignified use of femininity to send a political message that’s bold and clear. It’s fashion that bends the rigidity of female roles without sacrificing the femininity that female leaders have every right to keep.

The most trusted man in America?

From the Daily Intel: Why neo-conservative pundits love Jon Stewart.

Back in April, when the debate over torture was roaring, Jon Stewart invited Cliff May, a national-security hawk and former spokesman for the Republican Party, to come on The Daily Show and defend waterboarding. May was hesitant. He thought Stewart would paint him as a crazy extremist. The audience would jeer. It would be a disaster. “I was apprehensive about going on, even though I’ve been on TV for a dozen years,” says May. “A lot of my friends told me: ‘Don’t do it. You’re meat going into the sausage factory.'”

But May had a change of heart after soliciting advice from his friend Bill Kristol, editor of the Weekly Standard. “Kristol told me: ‘You’ll be pleasantly surprised. He doesn’t take cheap shots. Jon is smart. You’ll do just fine.'” Kristol proved to be right. Stewart’s interview of May – a crackling, lengthy debate about where to draw the line between freedom and security – produced one of the most clarifying discussions about torture on television. “Literally, this is the best conversation I’ve had on this subject anywhere,” May told Stewart.

White House Art

From the Wall Street Journal: Obama is changing the art on the White House walls.

Their choices also, inevitably, have political implications, and could serve as a savvy tool to drive the ongoing message of a more inclusive administration. The Clintons received political praise after they selected Simmie Knox, an African-American artist from Alabama, to paint their official portraits. The Bush administration garnered approval for acquiring “The Builders,” a painting by African-American artist Jacob Lawrence, but also some criticism for the picture, which depicts black men doing menial labor.

Last week the first family installed seven works on loan from the Hirshhorn Museum and Sculpture Garden in Washington in the White House’s private residence, including “Sky Light” and “Watusi (Hard Edge),” a pair of blue and yellow abstracts by lesser-known African-American abstract artist Alma Thomas, acclaimed for her post-war paintings of geometric shapes in cheery colors.

More on the NEA

From the New York Times:

Engagement, or echo chamber?

Nicholas Kristof at the New York Times: The Daily Me.

When we go online, each of us is our own editor, our own gatekeeper. We select the kind of news and opinions that we care most about.

Nicholas Negroponte of M.I.T. has called this emerging news product The Daily Me. And if that’s the trend, God save us from ourselves.

That’s because there’s pretty good evidence that we generally don’t truly want good information – but rather information that confirms our prejudices. We may believe intellectually in the clash of opinions, but in practice we like to embed ourselves in the reassuring womb of an echo chamber.

Staff arts & culture position in the White House

From the New York Times: Cultural Post at the White House.

The White House declined to describe the position in detail, since Mr. Dale’s appointment has yet to be formally announced. Mr. Ivey, a former chairman of the National Endowment for the Arts, said he expected that the job would mainly involve coordinating the activities of the National Endowment for the Humanities and the Institute of Museum and Library Services “in relation to White House objectives.” Although there have been staff members assigned to culture under past presidents, they usually served in the first lady’s office, Mr. Ivey said.

More on vegetables in the White House

From the New York Times: Michelle Obama’s Agenda Includes Healthful Eating.

White House officials say the focus on healthy living will be a significant item on Mrs. Obama’s agenda, which already includes supporting working families and military spouses. As the nation battles an obesity epidemic and a hard-to-break taste for oversweetened and oversalted dishes, her message is clear: Fresh, nutritious foods are not delicacies to be savored by the wealthy, but critical components of the diets of ordinary and struggling families.

As a resident of a mixed-income neighborhood, I can attest to the fact that if Michelle Obama can pull this off in urban neighborhoods and low-income families, it will be a major win. I think we can expect Big Ag to object.

Why the several postings about the White House food agenda on this blog, you might ask? I firmly believe that in the “world that ought to be,” good food as it grows from the ground would be consumed with great joy – along with the delicious concoctions we’ve come up with (yes, including burgers) – all in moderation. Health would not be a luxury for the rich; in fact, the developed world is somewhat unique in that the poor often eat less nutritious food than the wealthy. It never hurts to get a push from the top.

Victory!

From the CBS News Political Hotsheet: Vilsack Adviser Predicts Vegetable Garden On White House Lawn By Summer.

Vilsack adviser Neil Hamilton, the chair and director of the Agricultural Law Center at Drake University Law School in Iowa, says yes.

“I believe that by this summer there will be a garden – another garden, a vegetable garden – on the White House lawn,” Hamilton said at a weekend legal seminar at Yale University.