Friday, April 30, 2010

Teaching Authenticity

For those of us who teach courses in popular music, one inevitably arrives at the "authenticity" moment. Perhaps it is part of the syllabus, a unit of thinking about the methodology of pop music studies. Perhaps it arises more organically, say in a class discussion when someone offhandedly uses Miley Cyrus as an example of the decline of western civilization--the actual word "authentic" is rarely used, but that ideology is usually in the background of complaints about auto-tuning and controlling manager/fathers or whatever. However it happens, I think most of us have a usual schtick for this topic. I like to do a little brainstorming, first coming up with a list of pop artists that the class thinks are "real"--this tends to include the likes of Bob Dylan and Jay-Z, depending on the relative level of snobbery in the local student population--and then from that, extrapolating a list of characteristics, musical and otherwise, that link those artists they find most real. The point of the brainstorming is that, as a class, we hopefully come to realize that the ideology of "authenticity" is usually a means to promote our own arbitrary aesthetic interests.

So, fine. That's some good teachin'. However, the other day I was musing aloud in class about Green Day. As you may know, last year Green Day collaborated with Berkeley Rep to produce a musical theater version of their 2004 hit album American Idiot, and this production is now on Broadway itself, garnering positive reviews. The whole thing is, of course, shocking for those of us who had some part of our musical consciousness formed by punk rock. I grew up in the East Bay, where Green Day is from, and although as a teenager I was nowhere near cool enough to have been part of the Gilman punk scene that gave birth to Green Day, Rancid, etc. in the 1990s, I did know people who were, and I remember the community horror that greeted Green Day's decision to sign with a major label and release Dookie. The term "sell-out" was widely used, with no irony intended; it was as if Green Day had singlehandedly destroyed DIY punk rock. And now...they are doing a Broadway musical?!

I have enough academic distance from that scene to recognize the ideology of authenticity at work. (And of course, the boundaries between punk and the camp theater of glam rock were never exactly distinct to begin with.) But anyways, the whole point of this post is that in musing aloud about this on different occasions, in the presence of students who grew up not in the 1990s but after the turn of the millennium, I realized that not everyone was registering my mock outrage at the idea of a punk rock musical. To be sure, some understood why the whole thing was funny. But my uncomfortable realization was that many saw no problem with the concept at all. There are rock musicals, hip-hop musicals, why not a punk musical?

And that's when it hit me: there are many people out there who, rather than needing to be taught to deconstruct authenticity, actually need to be taught what authenticity is in the first place. For if you grew up over the turn of the millennium, authenticity was never really part of the game. I feel like I've been trying to disrupt my students' commitment to authenticity for years now, only to find that now I need to figure out how to teach it. For you can't really understand a number of musical trends of the past fifty years--the weird commitment of English guitarists to the blues, say, or the backlash against disco--without understanding the idea of authenticity. It's the fundamental building block of a great deal of music criticism; heck, you could say that the discourse of authenticity is a fundamental building block of American popular music itself.

I certainly don't miss it now that it's gone--and I exaggerate for effect here, of course--but as a teacher I'm going to need to make some adjustments.

Friday, April 23, 2010

An Ethnic Reunion

There was an interesting article in the Philadelphia Inquirer today. One of the old Italian neighborhoods of North Philly is having a reunion. Up to six hundred old denizens, mostly Italian members of the now-defunct parish of Our Lady of Pompei who fled to the suburbs in the 1970s, are expected to come back to their old stomping grounds. Most of them now live in Northeast Philly, or even further out in the suburbs, and were apparently itching for the chance to reconnect and reminisce.

One of the things I talk about in my research is the process by which various immigrant communities in the United States have slowly been assimilated into the category of "white," with all the political and cultural power that entails. This is obviously a process that has been going on for a very long time; books like David Roediger's The Wages of Whiteness and Noel Ignatiev's How the Irish Became White trace that history especially amongst working-class Irish immigrants in the 19th century. The latter book has always been particularly fascinating for me, both because it examines my adopted home of Philadelphia, and also because it was the rare work of critical race studies that actually crossed over into a mainstream market. It was widely discussed when it came out, and I seem to recall that supposedly Bill Clinton read it avidly.

The 1950s, which is of course my own area of interest, saw another surge in assimilation. There were many factors involved in this, but the one particularly highlighted by this reunion was the geographic mobility created by World War II. Cultural coherence requires spatial coherence; it's a lot easier to maintain traditions if you live in one neighborhood, where your church, community center, social club, local newspaper and radio station, and all sorts of other institutions can cater to your language and traditions. As soon as the massive migration to the suburbs began, those institutions became too difficult to maintain--just by way of example, the percentage of radio stations broadcasting in a language other than English declined by 40% between 1942 and 1948.

And all of this happened in dialogue with academia and politics. The period after World War II was when the mainstream social science consensus began to use the term "ethnicity" rather than "race," that is, privileging culture over biology. This was a move made with liberal intentions, although certain categories of human beings--especially African Americans and Asian immigrants--were still subsumed under the idea of "race" rather than "ethnicity." This tension over who counts as biologically different versus who counts as merely culturally different was especially reflected in the McCarran-Walter Immigration Act of 1952, which substantially liberalized immigration for those coming from Europe, while maintaining strict quotas for those from other parts of the world. This was accomplished by pinning immigration to the 1920 census: each European country was given an annual quota equal to one-sixth of one percent of the number of 1920 citizens descended from that country. By effectively discounting immigration from the previous three decades, the Act encouraged more immigration from Ireland and Germany, and less from Eastern Europe. The rest of the world, however, was treated not along ethnic lines, but racial ones: immigrants from Asian countries were counted together as one bloc, rather than as individual countries. And if you were, say, a Brit of Asian descent, you were counted as Asian rather than British, and had to fight for one of the 2,000 slots allowed from Asia.
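To make that arithmetic concrete (with a hypothetical round number, not an actual quota): if the 1920 census had counted 3,000,000 citizens descended from a given European country, that country's annual quota would have been 3,000,000 × 1% ÷ 6 = 5,000 visas per year. Countries whose immigrants had mostly arrived before 1920 thus got proportionally generous quotas, while the sources of more recent immigration got shortchanged.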

As the country yet again begins to discuss immigration reform, it's always good to remember these past engagements. Purely racial arguments are rarely made in mainstream political discourse anymore; we more often hear arguments having to do with culture. Language, obviously, but as one defender of the proposed Arizona anti-immigration law puts it, also fashion, even shoes. When someone starts spouting nonsense like this, it's a good guess that there is a racial logic underneath!
Rep. Bilbray: They will look at the kind of dress you wear, there’s different type of attire, there’s different type of—right down to the shoes, right down to the clothes. But mostly by behavior it’s mostly behavior, just as the law enforcement people here in Washington, DC does it based on certain criminal activity there is behavior things that professionals are trained in across the board and this group shouldn’t be exempt from those observations as much as anybody else.

(h/t)

Saturday, April 17, 2010

A Country Church



For the past year or so, I've found myself on the email list of the vestry of a small parish church in Bedfordshire, England. It's not spam; one of the committee members not only shares my last name, she also uses the same email service and there is only one letter's difference between our email addresses. I've occasionally tried to write back to the priest, and she is very apologetic and promises to fix her email list, but every week or so I get another email from her or one of the other vestry members about scheduling a meeting, finding volunteers for events, and so on.

Part of me actually enjoys this strange little window into the small life of a country church in England, so far removed from my circumstances here. It's a pretty church, probably Saxon in origin, with an important 13th century fresco of Christ on the cross. It appears to be a small congregation, but a cheerful one; Holly won the "Silly Hat" competition, and over the holidays the singing carolers raised almost £300 for the Children's Society. I hope the infestation of "deathwatch beetles" (sounds grim!) is controlled before it destroys the roof. I wonder if they could use help fixing the web site; it looks a little worn around the edges.

The bizarre thing is, I have the feeling that I might have actually been to this church--it's located right near where Mary went to veterinary school outside of London, and I had a habit of stopping to examine pretty old churches during my visits to her. I could be imagining it, but the pictures on the web site look awfully familiar.

However, Rosemary Gentry, if you are out there, don't be miffed if your fellow committee members are not consulting you to schedule servers for Sunday services! They are trying their best.

Either that, or this is an elaborate hoax that is going to end up with me wiring thousands of dollars to Nigeria.

Thursday, April 15, 2010

Speaking of April

Speaking of historic dates in April, did you know that April the 15th is very important as well?

The sinking of the Titanic in 1912--the ship hit the iceberg on the 14th but succumbed on the 15th:



The death of Abraham Lincoln in 1865--similarly, he was shot on the 14th but succumbed on the 15th:


On the plus side, it's also the birthday of Leonardo da Vinci (1452), Emma Watson (1990) and myself! (1980)

Wednesday, April 14, 2010

Remember Remember the 14th of April



It's one of those moments in the political cycle where Confederates in the attic are popping up all over the place. This current outcrop is largely thanks to the beloved governor of my current state, Bob McDonnell, who saw fit to give what I'm sure he hoped was some mild political payback to pro-Confederacy groups, in the form of declaring April "Confederate History Month."
WHEREAS, it is important for all Virginians to reflect upon our Commonwealth's shared history, to understand the sacrifices of the Confederate leaders, soldiers and citizens during the period of the Civil War, and to recognize how our history has led to our present...
Pretty bold claim there, Bob, to argue that history leads to our present. Anyways, too much digital ink has been spilled on this already, so I just want to point out that although our country makes a habit of constantly remembering the Civil War, we equally have a bad habit of resolutely forgetting its immediate aftermath: that period listed in our history books as "Reconstruction," and the vigorous retrenchment back into white supremacy that followed. This is a pity, because as a matter of history "leading to our present," it's not actually the war itself that has the tremendous impact. War is war; it is what it is, and its impact on individuals and their survivors rarely lasts more than a generation. It is how society deals with that impact, through memorials, cultural remembrance, legal policies, or whatever else, that carries the impact to future generations.

Take, for example, the ridiculous complaints by right-wing critics that Obama is being disrespectful when he puts his feet up on the presidential desk. Kathryn Lopez of the National Review voiced the complaint succinctly on Twitter:
(h/t Wonkette)

Many of us have chortled over the inanity of this complaint, given that the historical record is full of photographs of just about every single president, George W. Bush included, with their feet up on that same desk. But where does the anxiety about feet upon the presidential desk come from? Simply put, this fear is a relic of Reconstruction. As the US government attempted to introduce representative democracy to the south after the war, one of the results was a large number of black men--often the majority, after all--holding elected office. One of the tactics by which white supremacists fought back was to claim that these black politicians, being uneducated and naturally uncouth, were disrespecting the office. One of the famous images from D.W. Griffith's racist mythology The Birth of a Nation is precisely the same image that gave Lopez such indigestion, lawfully elected black politicians with their feet on official desks:
(h/t)

So in memorial of Reconstruction, and its legacy of racist anxieties, I want to point out that today, April 14th, is often given by periodizing historians as the beginning of the end of Reconstruction, and of the resurgence of white supremacy in the south. On this day in 1873, the Supreme Court handed down its decision in the so-called "Slaughterhouse Cases," gutting the Fourteenth Amendment as a tool for the Federal government to protect the rights of African Americans. And just the day before, down in Louisiana, a group of white militiamen had slaughtered over a hundred black citizens in the Colfax Massacre, burning down a courthouse in which a large group had taken refuge, and shooting all those who tried to escape the flames. It's a bit much to claim that this single day sent our country down the path of Jim Crow laws, segregation, and the Ku Klux Klan, but it was certainly the beginning.

Happy April 14th.

Friday, April 9, 2010

Second Thoughts

Okay, so sue me, I can't make my mind up about what blogger template to use. It's funny how limited the options are.

Actually, however, my second thoughts are regarding the iPad. Last we spoke, I couldn't possibly think of a reason to get one. Now, I'm suddenly contemplating replacing my laptop with an iPad. What changed?

1. The name. I was one of those who thought that this name was proof of the fact that Apple apparently has no female employees, and I couldn't get the image of that MadTV sketch out of my head. Such is the power of marketing that I've completely forgotten about it now.

2. Portability. I found out that you can run Keynote presentations straight off of your iPad (presumably via some sort of adaptor to connect to your projector). I spend a lot of time lugging my laptop around for that purpose, and wouldn't mind lugging around something much lighter. Also, as I said in the comments to my previous post, I fantasize about having my students upload their papers through Blackboard, putting them on my iPad, and grading them there. And having the built-in connection to AT&T's network is much more elegant than buying one of those laptop connect dealies. Thanks to the iPhone I've grown to love having internet with me at all times. In general, since I've started needing to have my laptop with me nearly all the time, I wouldn't mind slimming down some. Or a lot.

3. Consumption. The general consensus has been that the iPad is a device built for consumption, not production. For someone like me, who produces a lot of writing nearly every single day, this was one of the main problems--the lack of a keyboard would drive me up the wall, and adding an external keyboard seemed to defeat the purpose. However, in reading some reviews of the product, I realized that my engagement with computers actually takes place in two different spheres. On the one hand, when I'm doing actual work on my laptop, I tend to be seated at my desk. That's because I like to have a proper chair, to have my books spread out, and to be near my printer and scanner, etc. On the other hand, I also spend a lot--a lot--of time simply consuming the internet. I'm one of those people who read multiple newspapers virtual cover to virtual cover every day, stay logged onto Facebook all day, and are always reading blogs and commentary. I tend to do this on my laptop while slouched on the couch. I also have been browsing the internet more and more on my iPhone. That's partly because I don't have internet at my home in Virginia at the moment, but because of that I've actually grown fairly accustomed to using an input-less version of the internet, and it's actually not so bad.

My realization with regard to my computer habits is that my laptop--I have one of those older black MacBooks--is not perfect for either of those tasks. When seated at my desk, I would prefer to have a full-size keyboard and mouse, and to have various external devices permanently plugged in without having to get everything set up every time I switch places. And I would prefer the expansion capabilities and overall greater robustness of a desktop. Conversely, when I'm slouching on my couch obsessively reading through the Times, the laptop is a bit clunky. Sometimes I would like to be able to lie down and read, and I often find myself nearly dropping my laptop, or finding it in the way. An iPad would actually be about right.

Caveats: I still am not pleased with the file management situation. Apparently each application has its own file manager, and if you want to transfer files from device to device you have to do it through iTunes just like you do now with music. That's a big drawback. It's bad enough with the music; as someone who creates several playlists every single day and puts them onto an iPod for teaching, I know all too well how badly this system works for anything other than a handful of documents.

The biggest caveat is that I will still need a real computer in my life, and I would like it to be a fairly robust desktop machine. Depending on my employment situation I might have an office computer to use, but I'll still need something at home. So my laptop will actually be replaced by two devices, and that adds up. The most expensive iPad with 3G ($829; not having the 3G service defeats the purpose for me) and a decent iMac ($1,499) together come to $2,328, substantially more than just a MacBook Pro. Or I could go big and find the money for a Mac Pro, which with its ability to expand (and use third-party monitors and such) might save money over the long run and indulge my geekier side.

And needless to say, I can't possibly imagine buying a first-generation iPad. I'm hoping that my laptop hangs on for at least another year, and that by then some of the kinks will be ironed out. Stay tuned for more of this riveting journey towards iPad acceptance.

Monday, April 5, 2010

Happy Easter from the Pattis



I thought I could find some Easter-related song by my other favorite Patti, Page, but she seems not to have gone down that route. Best I could find was a version of "What a Friend We Have in Jesus" in her trademark multi-tracked style. Somebody seems to have put it on YouTube with a clip of her singing...something completely different. It's a little weird, but what can you do.



I highly recommend playing them at the same time. Smith occasionally drowns out Page, but the quieter bits of Smith let Page come through nicely at times, and the dueling organs can be fun if you work the volume levels a bit.