Category: Rants

On the concept of research and teaching as “two tracks”

Adam Grant, a professor of management and psychology at the Wharton School, recently published an article proposing to solve a set of related problems that supposedly plague higher education in America. In truth, he’s offering a myopic view of a non-problem, or rather of one that could only be envisioned from the privileged perspective of a faculty member working at an elite institution.

Adam Grant, “A Solution for Bad Teaching,” New York Times (2/5/2014) URL: http://www.nytimes.com/2014/02/06/opinion/a-solution-for-bad-teaching.html

Grant apparently shares the widespread inability of faculty at elite research institutions even to imagine the realities of life in the rest of American higher education. Elite faculty seem always to assume that the types of things they see happening in their context are typical across faculties in American institutions. This seems to me to be a major problem for the way discourse about education reform proceeds in this country, especially since their “elite” status seems to grant them access to prominent public fora like the New York Times, while people who better appreciate what’s going on may be relegated to bitching on Facebook or taking the time to put together a hasty sketch for a blog post.

Yes, Grant reports on research, and offers seemingly persuasive commentary about how different the skills of “research” and “teaching” are. He takes note of the different teaching practices and outcomes among non-tenured versus tenured faculty. This seems solid enough, and probably does reflect the experience of faculty and students at elite “R1” research universities, but it has almost nothing to do with what goes on in undergraduate education at the thousands of liberal arts colleges in this country where most faculty and students work.

According to the Carnegie Foundation for the Advancement of Teaching, no more than 10% of students in higher education, nationwide, are served by such large research institutions; the other 90% attend schools with faculty who are far less focused on research. The faculties of those research institutions are, likewise, only a small percentage of those employed in higher education.

For data supporting this claim, see the table of institutions, and the analysis of enrollments, at http://classifications.carnegiefoundation.org/summary/ugrad_prog.php.

Grant’s proposals are thus a set of solutions in search of a problem, or rather, they are a symptom of misreading the economic and practical landscape of higher education in America.

The fact that R1 schools have problems with the teaching quality of the people they hired to be researchers is not news. Universities realize that they often get poor educational outcomes from their most highly valued employees, and that they can no longer justify their model economically.

Cry me a river.

The R1 schools went out of their way to hire people as researchers, rewarded them for their research, and then gave them tenure and new administrative roles. It shouldn’t surprise anyone that these people don’t make teaching their focus.

So, in response, Grant supports an idea of separating “research” track professorships from “teaching” tracks. You might think this would create a caste system in research universities. And maybe it would. Or maybe, it would just reflect what is already in fact the practice of higher education in America.

Because in this country, we already separate research from teaching.

At present, this separation takes place in the job market, where universities go out to hire people to do research, while colleges (liberal arts, general, and community) hire people to teach.

You have to look at the system as a whole, and measure what is going on at the inter-institutional level. System-wide, we heap economic and professional rewards (including time and support for research and writing) onto a very select, very lucky few. These are the PhDs who get hired to work at research universities, where they benefit from lower teaching loads. A great many newly minted PhDs go without academic employment altogether, and the rest of us are hired to teach more and be paid less at teaching-focused schools. Some of us get tenure; the rest operate as minimum-wage drones in the adjunct system, or leave academia altogether.

What the “dual track tenure” proposal amounts to is an attempt to impose the conditions under which most of us already operate onto the much smaller pool of academics who make up the faculties of the R1 universities. What he is actually proposing is that R1 institutions act more like the teaching schools.

Schools in America have shown that they don’t really need any empirical research on teaching effectiveness to justify pushing more and more of the teaching burden off of tenured research faculty, and onto the non-researching contract faculty whose jobs are focused on teaching. That process is long underway. There have never been more students being taught by exploited adjuncts and only slightly less exploited term employees. And besides, Grant reports, as if it’s a bonus: there’s research that shows research has nothing to do with teaching outcomes, and even research that says students learn more from non-tenure-track faculty! Wow. Cool.

We’ve heard this before. Of course any research that reports that students learn more from such faculty will warm the hearts, I’m sure, of administrators everywhere. But what it really means is that all professional full-time teachers are in trouble.

As a well-prepared and thoughtful but ordinary liberal arts college professor, I’d like to do more research and writing for publication. But that’s unlikely ever to happen. I personally find it almost impossible to incorporate a serious writing practice into a life of “scholar-teacher-family-man” with a 4/4 teaching load. Others in my position, particularly those who decide to forego family life, seem to fare better at keeping one foot in the world of “research.” But most of us are far less productive, as researchers, than our counterparts in the universities. I’m therefore often envious of colleagues who have secured one of those elite jobs where they get paid to act like an expert in their subject field by conducting obscure and technical research for publication in journals that will only be read by other elite researchers.

Somehow, it doesn’t make me feel better to see that R1 institutions are re-thinking their model. In fact, I would seem to occupy precisely the middle ground that has the most to lose in this discussion. I have tenure and am being paid a middle-class salary, with benefits, to do something for my students that a slightly younger Ph.D. would probably be willing to do for approximately 1/6 the cost (as an adjunct) or 1/2 the cost (as a term employee).

Along comes Grant with his unique economic logic. If the R1 universities took Grant’s advice, they would begin hiring new faculty to work exclusively as teachers. They would presumably be paid less and expected to teach more. These faculty would exist alongside the (presumably thereafter much smaller) research faculty, who would be paid the same or more, and would have research and publication as their primary or only responsibility.

Grant even writes, at the end of a colossally tone-deaf passage, that while “[r]eplacing adjuncts with tenured teachers would cost more,” nevertheless “there are ways to offset that, perhaps by funding more research with grants.” These kinds of comments drive University Provosts and Academic VPs to drink. “No problem… just turn your existing adjuncts into tenured professors.” Ha. What he’s actually proposing is to kill off any remaining pretense that university faculties are groups of scholars working on disinterested research in recognized areas of important inquiry. Research funded by external grants depends on corporate donors or political processes that are, by definition, interested. His proposal could bankrupt the universities, or unacceptably narrow the range of subjects where serious research happens, or do both.

By the way, have you ever noticed how difficult it can be to get an ordinary American undergrad to benefit from working with a typical peer-reviewed article or serious monograph? Most academic writing is pitched to a graduate-student or professional-peer audience. You can just imagine Grant’s proposal exacerbating this problem. On the other hand, books written for undergrads—i.e. textbooks—are sometimes models of the worst sort of scholarship, offering at best only minimal engagement with sources, or failing to model critical interaction with the literature, or studiously avoiding “technical jargon” at the expense of providing a foundation for further study in the field. What might actually be needed is MORE, not less, research and writing done for the college-educated lay audience, written by people who professionally teach young adults, and LESS, not more, advanced and highly technical research. But what do I know?

In this new system proposed by Grant (and others) the elite will become even more elite. Congratulations to you rock star researchers! When the rest of us finish grading our undergraduates’ papers, we might read your latest article. If we can find the time. I will go ahead and assume we can understand what you’re working on. And to all you new teaching-track professors, welcome to the working week! Let’s hope that we don’t all compete each other out of a living wage.

Post-Script (Feb 10, 2014):

According to an April 2013 report in The New York Times, the average salary of professors at private research universities is over $160,000 per year. I just want to note that this is more than three times my current salary. This emphasizes my point nicely: in this country, we already sharply divide “research” from “teaching,” and reward the “research” of an elite few far more than we reward the “teaching” of the many. (The same article reports that assistant professors at schools like mine average just over $60,000 per year, which, for the record, is around 20% more than my salary as an associate professor at a relatively poor school in rural Appalachia.)

But please don’t think I am complaining about my modest salary. Because I know that things could be a lot worse. The same report explains that tenured or tenure-track jobs make up only 24% of the academic workforce. Yes, that’s right: about 76% of those presently teaching in colleges and universities in America are adjuncts or defined-term lecturers. Bear in mind that the average pay for an adjunct teaching a college-level course is less than $3000 per course, with no benefits. Accordingly, we can say that these people, who do most of the teaching for us, are usually paid only 1/3 to 1/5 what I am paid for the same teaching load.

If you needed any further evidence that Prof. Grant is completely out of touch and off his rocker, consider how nonchalantly he admits that moving teachers from the ranks of the adjuncts into tenure-track teaching-only jobs might be kind of expensive. You think so? Oh, but we can just get some grants to support the work of all those research professors. Is Grant really this clueless?
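Since I’m throwing numbers around, here is a back-of-the-envelope sketch of that last comparison. To be clear, every input is an illustrative assumption of mine, not reported data: a 4/4 load, per-course adjunct pay in the range just cited, my own salary of roughly $50,000, and a guess that benefits add about 35% to a salaried line.

```python
# Back-of-the-envelope: what the same 4/4 teaching load costs a school
# when staffed by an adjunct vs. a tenured teaching professor.
# Every number below is an illustrative assumption, not reported data.

COURSES_PER_YEAR = 8            # a 4/4 load
TENURED_SALARY = 50_000         # assumed: ~20% below the $60k assistant average
BENEFITS_MULTIPLIER = 1.35      # assumed: benefits add ~35% to a salaried line

tenured_total = TENURED_SALARY * BENEFITS_MULTIPLIER

for per_course in (1_800, 2_400, 3_000):   # assumed adjunct pay per course
    adjunct_total = per_course * COURSES_PER_YEAR
    print(f"${per_course}/course -> ${adjunct_total:,}/yr, "
          f"roughly 1/{tenured_total / adjunct_total:.1f} of the tenured cost")
```

Run it and the adjunct line comes out at somewhere between roughly a third and a fifth of the tenured one, depending on the per-course figure, which is the neighborhood of the ratio quoted above.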

Sources:

Tamar Lewin, “Gap Widens for Faculty at Colleges, Report Finds,” New York Times (4/8/2013) URL: http://www.nytimes.com/2013/04/08/education/gap-in-university-faculty-pay-continues-to-grow-report-finds.html

William Pannapacker, “Just Look at the Data, If You Can Find Any” Chronicle of Higher Education (6/17/2013) URL: http://chronicle.com/article/Just-Look-at-the-Data-if-You/139795

Random Self-Censored Facebook Comments about the Nye / Ham “Debate” in Part being Focused on a Reply to Albert Mohler

This is possibly one of the most ill-conceived blog posts I have ever done. It consists mainly of things that I wrote and was going to post to Facebook, but which I instead cut and pasted into a file on my computer. I then brushed and polished them slightly for publication here on my blog.

Basically, what follows is a random, barely edited, not at all organized series of paragraphs, each of which was once almost going to appear in a comment on some unsuspecting friend’s post. Until I censored myself. I could see that I was being too long-winded for Facebook, and that I was going to cross the line in ways I didn’t want to. All the comments were on threads about the extended debate-show between Bill “The Science Guy” Nye and Ken Ham of Answers in Genesis, which took place last night. (I’m not posting a link to that. They are everywhere at the moment. If you’re reading this at some later date and haven’t heard about it, you can google it.)

Bill Nye looks sideways at Ken Ham during their Debate (2/4/14).

I don’t want to offend, but I’m afraid it will be impossible to avoid offense. Because these words were pouring out of me and I wanted to record them. So here goes.

The bulk of this rant started in response to the following article, which, I suspect, you’ll have to read in order to follow what I write here.

Albert Mohler, “Bill Nye’s Reasonable Man — The Central Worldview Clash of the Ham-Nye Debate,” AlbertMohler.com (2/5/2014) URL: http://www.albertmohler.com/2014/02/05/bill-nyes-reasonable-man-the-central-worldview-clash-of-the-ham-nye-debate/.

For those who don’t know, Albert Mohler is president of the Southern Baptist Theological Seminary, the flagship graduate school for preachers and scholars who belong to the Southern Baptist Convention. He’s an avowed young earth creationist, but he’s also a remarkably well-read scholar. For my money, Al Mohler is the most educated man in conservative Christendom, and I always seem to enjoy reading his take on things. That is not to say that I agree with him—I almost never do—only that I respect his learning.

Mohler had a front row seat at the debate, and he shares a stimulating set of reflections about his experience. He seems to be striving for balance and respect. He praises Nye’s outfit and pen, and essentially calls him a man of style. Nevertheless, something about Mohler’s analysis of last night’s debate has just set me off.

First off, let me say that I do think Mohler is right when he says the debate “was [really] about the most basic of all intellectual presuppositions: How do we know anything at all?” That is, Mohler points out that the real debate concerns what is properly called epistemology. The problem is that Mohler falsely characterized the epistemic commitments of Bill Nye.

Selfaganda from Al Mohler’s Twitter Feed

The way I read the debate, Nye clearly stated in so many words that his mind could be changed by evidence, whereas Ham (and Mohler by self-identification) insist instead that their minds cannot be changed no matter what evidence is presented to them. Mohler and Ham are already standing on Christ the solid rock, they already have all the “answers in Genesis.” Bill Nye is ready to be persuaded to think otherwise than he does if evidence is presented that contradicts the evolutionary model.

Yet Mohler writes, falsely, “Both men were asked if any evidence could ever force them to change their basic understanding. Both men said no.”

Actually Bill Nye explicitly said that his mind could be changed by evidence, whereas it was Ham alone who denied any possibility of ever changing his mind, and asked for our understanding and sympathy, because his stance is rooted in faith in God’s word. Mohler basically denies the facts of what happened on stage. That bummed me out, because I usually respect Mohler’s ability to conduct a relatively accurate analysis of a person’s position, in spite of his disagreements.

Of course there is a big difference between a scientific way of knowing the world and a religious way of knowing the world, and it was on full display last night for all to see. There is a clash of worldviews, but it is not the case, as Mohler suggests, that naturalistic science is one kind of “faith” and a Biblicist creationist science is just another, equivalent kind of “faith.”

I don’t need to bother to deny formal or structural similarities between science and religion to make my point. I can allow a large number of parallels. There’s no denying that homologies exist between “Magic, Religion, and Science,” or that in practice, trust plays a role in scientific knowledge. Of course scientists must have some faith in authority, etc., in order to sustain scientific practice. This is what we have peer review for. “Trust” in the reports of others is a fundamental aspect of any sane epistemology; the alternative is either radical skepticism or an unrealistic individualism which requires each person to construct their own complete picture of the world through a comprehensive empirical study. Of course scientists trust other scientists to be truthful and accurate and careful in their work and reporting of their work.

Even so, when it comes right down to it, science remains that way of knowing the world which is flexible and open to persuasion and change—it does not set prior authority into a position where it can be considered a sacred inviolate authority. On the other hand, religion all too often, at least in its fundamentalist mode, is characterized precisely by its refusal to let go of prior traditional authority, which it treats as sacrosanct and perhaps, in a fundamentalist mode, inerrant.

Nye’s epistemology is empirical. He seems content to accept a pragmatic or verificationist approach to truth. This is a hallmark of what is usually called simply “science.” Of course, Ham insists on calling this “observational science,” and wants to insist that such science cannot tell us anything about the past. That is, of course, poppycock. All that humans have access to is technically “in the past.” As Faulkner observed, the past isn’t even past. When you read a scientific report of an experiment, you’re reading about something that took place last week, last year, or decades ago. The same laws of nature and Nature’s God applied then as now.

For this reason, all scientists that I know would insist that all science is observational; palaeontology, geology, etc., are rooted in observation of phenomena that are present in the world, but that have, undeniably, a past.

One cannot begin a debate on paleontology and evolution by insisting that “science” per se is not qualified to investigate the past because “we can’t observe the past.” This is hogwash, and Mohler’s claim that it is rooted in a different notion of epistemology (what he calls a worldview) just covers up the plain intellectual bankruptcy of Ham’s version of creationism.

Science’s empirical epistemology works by starting with observation, modeling what is observed, and then testing the adequacy of models through continued experimentation, which is actually just doing more observing. Science consists exactly in this and in this only: proposing and testing models that can be used to explain observed phenomena. When models fail to account for observed experience (empirical data), then they must be revised or thrown out.

You can thus argue that scientists do trust in and have faith in their models and theories—they do, because these are hard-won products of labor by many people over long periods of time—but you can’t simply invoke this fact in order to plead that there is a moral equivalence to fundamentalist intransigence on the authority of (one interpretation of) the Bible as God’s word. Prior models and authorities just aren’t God’s word for scientists. They are starting points from which the work goes on.

Ham and Mohler are epistemic authoritarians, which is to say that in addition to their normal practical use of everyday science for getting around and feeding themselves, they also hold on tightly to a script of “Answers in Genesis” (and all of scripture). Their scriptural “answers” pre-determine how they will answer questions raised by any new observation or experience in the world.

Like normal scientists, people of faith must of course rely on observation and modeling for getting through the everyday world (as everybody must… you don’t read the Bible to find out whether your car needs gas, you look at the gauge on the dashboard). And they also rely on a prior authority for what they claim to know. The difference lies in the relationship to prior authority. Fundamentalists treat the authority of the Bible as an inviolate absolute. No matter what they observe or experience—no matter what the empirical data is—if a proposed interpretation of that data conflicts with what they know about the world from authority, then that interpretation must be wrong. Thus all their “knowledge” of the world is necessarily shaped by the contents of an external authority (some theologians call this a heteronomous, as opposed to autonomous, epistemology). In the case of Biblical Christian fundamentalists, the chosen sacred authority is the Protestant canon of scripture, which, it is important to point out, is regarded as a direct revelation from God. Therefore their position ultimately rests on a presupposition that knowledge as such can only be grounded in divine authority; it cannot be grounded in the rational agency of creatures. This is a theistic epistemology that locates knowledge with God. It resembles, in some ways, Neoplatonic idealism, but without Plato’s optimistic assessment of the inborn divinity of the human mind, which could then approach the divine source of knowledge.

Mohler suggests that Ham would agree that human knowledge cannot be created autonomously. Humans aren’t capable of coming to a true intellectual understanding of the world or its origins without divine aid. Why? He argues this on the basis of his belief, which he says comes from the Bible, that the human intellect is completely corrupted because of the fall. (I would refer you to Mohler’s use of Romans chapter 1 in his post.) We have a noetic deficit that prevents us from truly knowing anything on our own. Instead, we must have a divine revelation of truth—not to say a divine regeneration of our minds?

By the way, my own theological view, as a Biblical scholar, is that this Calvinist insistence on the total depravity of the human mind diminishes far too much the status of human beings as created in the “image of God”; it further ignores the fact that in Genesis, the fall itself was a fall from ignorance and innocence into knowledge that made us aware that we are “arum,” that is, naked—but the word puns on “arum” meaning “clever,” “crafty,” “astute,” or “subtle” (like the Serpent). By eating the fruit of the Tree of the Knowledge of Good and Evil, we end up being more, rather than less, like God. At the end of the episode, God announces his intention to drive the humans away from Eden; why? “See, the man has become like one of us, knowing good and evil, so, lest…” (For all this see Gen 2-3). But this is Biblical exegesis, interesting, but irrelevant to my main “point” here, if there is one. Sorry to go off on a tangent.

Logically, there is circularity in Mohler’s type of argument. How does Mohler know that the Bible is the authoritative text, which can tell us authoritatively that, because of the fall, knowledge is only possible if we accept an authoritative text?

What makes Ham and Mohler so sure that it is the Christian Bible that constitutes the supernatural revelation which can ground our true “historical science”? Ham calls “scientific” ideas about the past “historical science” and suggests that his own faith in the Bible is just his version of “historical science.” So, if that is true, give us your scientific arguments for why this text, rather than others, offers the best account of historical origins. If you say it is because God revealed the text, then I ask: how do you know this before you accept the text?

Why not the Tao Te Ching? Why not the Popol Vuh? Why not the Vedas? Why not the Quran? Why not just the Jewish Bible? Why not the Theogony? Why not the Theaetetus?

Calvinist fundamentalists might argue that, because we are noetically diminished through the fall, God reveals this truth to us directly in our hearts, convicting us in advance of his truth. Such a position makes all doubt of Christian truth claims into rebellion against God. Convenient for their claims to authority, but not persuasive in a scientific argument.

Returning to an earlier point, the creationists are fond of charging naturalist scientists with having their own faith. Ham characterizes it as a naturalistic anti-theist materialism. Yes, we all have “faith” in something, and we all must trust in prior authorities if we wish to know much of anything in this vast world. But the real question is, how do you arrive at this “faith,” and moreover, what status do you give it in your practical dealings with experience? What process do you acknowledge as being a pathway to knowledge? Not all have Christian faith, but anyone with any faith will ultimately have the problem of accounting for how they got that faith. Saying “God gave it to me, glory to God” is not a satisfying answer except to another believer. Rather, show some evidence.

Yet the difference between young earth creationism and ordinary science is that ordinary science is open to seeing the evidence, and will happily challenge and test anyone who claims that they have evidence for anything. In contrast, young earth creationism likes to change the subject or ignore the questions. That’s what Ken Ham did to Bill Nye. Nye’s repeated refrain was, ‘show me the organism that swam up from one geological layer to another, and I’ll change my mind.’ We might as well put it more bluntly: until you can show me a fossil bed preserving humans and dinosaurs on the same level, conventional “naturalistic” geology will win this debate. Humans and dinosaurs did not overlap in history. In point of fact, at present there simply isn’t any evidence for the kinds of predictions that a young earth creationist might make. At the Creation Museum there’s a triceratops with a saddle on its back, where you can take a funny selfie. But there are no fossilized remains tying humans to dinosaurs. For young earth creationists, though, the absence of evidence is no obstacle: it would never stop them from believing, or from wanting to call that belief a science!

Scientists just don’t operate on this kind of faith. They don’t take for granted that prior authorities are always right no matter what, forcing them to adjust their interpretations of available data in light of that prior, infallible authority, or forcing them to begin all hypothesis-making from the starting point of that authority. They always remain open to revision of their knowledge and of their faith. And THAT, that is the ultimate difference. It is the one that Ham and Mohler have denied in this exchange.

A final thought on this topic: when one grounds one’s truth-claims and epistemology in an external authority, one still has the problem of how that authority is interpreted. In other words: what does the Bible mean? How does Ham or Mohler know that they have interpreted the Bible correctly? That they have accurately seen its meaning for us and formulated their faith properly in accord with the contents of the authority? Only by having a securely grounded interpretive method could they be sure that their use of prior authority had validity in interpreting experience or making predictions.

In answer to such issues, the fundamentalist believer may appeal to the human authority of tradition in interpretation (without admitting it is human), or instead will appeal to notions like “perspicuity” and the internal testimony of the spirit that infallibly does the interpreting on behalf of the believer. This is again a great circular affirmation of one’s own position: God inspires the believer to correct faith, and then inhabits the believer to ensure correct interpretation. Those who are without such grace and inspiration are, I suppose, simply doomed to noetic confusion.

These kinds of appeals cannot account for what we can easily observe happening when we look at the way texts are used within religious tradition. We can make a large number of empirical observations about the origins of the Bible, and we can investigate, using what Ham might call “historical science” (but which I just call “history”), the wide variety of human responses to it. We will certainly observe: differences in opinion among sects and commentators about the significance of texts, the formulation of doctrines, the essential contents of faith, etc., not to mention differences in responses to the text (faithful acceptance or rejection). Unless you maintain a radically sectarian vision of faith (namely that only that small minority of believers who share your same opinions have the true “light of God”), you have to embrace epistemological humility when it comes to the question: “what does scripture communicate?”

Twenty-Two Creationists Ask Questions About Evolution

In another thread, we got interested in a brilliant photographic essay by someone else who was present at the debate. Photographer Matt Stopera recorded the comically insufficient ideas of twenty-two attendees by asking them to write notes to evolutionists and pose for portraits with their messages. This viral sensation is layered in irony, but is ultimately a sad testimony to the poor state of philosophical, scientific, and theological education in this country.

Matt Stopera, “22 Messages from Creationists to People Who Believe in Evolution,” BuzzFeed (2/5/2014) URL: http://www.buzzfeed.com/mjs538/messages-from-creationists-to-people-who-believe-in-evolutio.

One picture from the essay, in particular, stuck in my craw:


Photo Credit: Matt Stopera (2014); reproduced here by claim of academic fair use. For link, see above.

The phrase “just a theory” needs to be banished from discussions of whether or not a body of science should be taught in classrooms. People seem to think the word theory means “hunch” or is interchangeable with “speculation,” or even just “idea.” Not so.

The root word, “theoria,” is a Greek term from the same family as our word “theater”; both stem from a root meaning “to view” or “behold.” Originally it referred to the act of viewing—the spectator’s vantage on the stage. The implication of using the term to talk about scientific models is that when you have a “theory,” you have a point of view on an object of study that lets you see it fully. It is implicitly a term that weds empirical observation (of the object of theory) with conscious understanding (by the subject who theorizes).

If a science teacher begins class by saying, “today we shall learn about the theory of gravitation,” she is not proposing to discuss whimsical possibilities about what gravity might possibly be. She is not referring to myth, legend, poetry, verse, or song. She is proposing to review those models (mostly mathematical) which have been discovered, tested by experiment, and verified by replication, and thus shown to be useful for giving us a coherent and consistent understanding of certain phenomena we can observe in the world, like a ball falling to the ground.

Biological scientists call evolution a theory, not because they are just guessing, or because there is serious doubt out there about its validity. Quite the contrary! They call it a theory because there’s so very little doubt about the validity of the model. Evolutionary theory is different from gravitational theory; I grant that. It is a composite theory, drawing together models that have been developed to explain phenomena observed in a variety of related sciences (geology, anthropology, archaeology, palaeontology, biology, genetics, etc.). Together these models have given rise to the theory of evolution. It’s called a theory precisely because the model can explain our observations and can predict future observations we might make if we go and look (such as predicting that fossils of a certain previously unknown type of animal will be found in certain layers, because that is where we would expect transitional forms); and it remains a theory for us precisely because there aren’t any observations, so far, that really fail to fit into the model or call its premises and structure into question.

In my understanding of the history of science, young earth creationism began to die the death of a thousand cuts long before Darwin observed any finches. If I recall correctly, the science of geology was the first to draw blood, as its observations were the first to really call into question the concept of a 6000-year-old earth. Throughout the early centuries of the modern era, new discoveries constantly called rigid and literalist Biblical orthodoxies into question.

Before Darwin, anthropology, particularly driven by reflective encounters with peoples of the new world and the far east, began to call into question the tripartite theory of human cultural origins that had been assumed in Christendom on the basis of Gen 10-11. Critical philology and archaeology radically undermined confidence in the reliability of the Bible. Text criticism showed that the “revealed” word had been notably unstable through time, and dependent on fallible human tradition for its preservation. Historical “higher” criticism raised questions about authors, sources, cultural contexts, and other human factors in the composition of the texts. Archaeology not only shed light on Biblical narratives, it also raised doubts about the full accuracy of every “historical” detail in the Bible. Comparative mythology put the Biblical stories into the light of other ancient accounts of universal origins.

In the hard sciences, geology led to palaeontology, and the ever expanding observations of the fossil record revealed that different geological ages had been populated by completely distinct biomes. Furthermore, palaeontology revealed that there were strange and unknown ancient species, now extinct, a finding that shocked believers in its day, who had assumed that God willed all kinds to persist, even through a destructive flood. It seemed impossible that God would have created species only to allow them to die out; Darwin’s later embrace of this kind of waste, or excess, as a part of the process of natural selection, was particularly offensive to Biblical theologies. But it perfectly accorded with the observations of the fossil record. It was what Ham would call “observational science” that raised so many serious doubts about the validity of a literal understanding of scripture… and this long before Darwin’s ideas began to be taken up and discussed, and so to contribute to the theorization of all the data. If Darwin hadn’t argued what he argued, someone else would have; in fact, others did.

From the end of the 18th to the end of the 19th centuries, the humanities and the human sciences thus offered converging accounts of global origins and human history, telling a new story that sealed the fate—or should have sealed the fate—of literalist young earth creationism. This point of view never died out in American Christianity. But it did become the minority position.

It is beyond the scope of any one scholar to master every detail of the Bible. Therefore, those who hold to the full plenary verbal inspiration and “historical,” “factual” inerrancy of the Bible read literally are quite simply… ill-informed. They haven’t even bothered to take up the full depth of the detail in the Bible (not that any one person could do so fully). They are obviously unaware of the many areas in which insoluble tensions among Biblical texts (let alone with the testimonies of external sources, like Assyrian records, for example) point clearly away from any firm notion of “factual” inerrancy. Far from resting epistemologically only on scripture, most fundamentalist believers accept the human testimony of scholars who have, frankly, falsified data and claimed, wrongly, that apparent problems are only that: apparent.

In the real world, the tools we use to measure empirical facts are not infallible (gas gauges break, etc.), but thankfully, we don’t need our tools to be infallible. They just need to work reliably in most instances. If your tools are in general unreliable, you get rid of them, you use different tools.

Now, if you want to know how old the earth is, should you just look in your Bible, where you’ll find your “Answers in Genesis”? Or should you look empirically at all available evidence—textual, archaeological, geological, astronomical—and using the best models available for understanding what you have observed (i.e. those tools we call scientific theories), formulate your best answer that way?

The answer is: use all the available evidence, and strive to understand that evidence using the best available models (theories) for making sense of it.

If in fact you answered “just look in your Bible,” I have three further questions for you. (1) How can you verify that the Bible has authority other than by quoting some part of the Bible itself? (2) What reasons can you give to a person who does not currently accept the authority of the Bible that would convince them that it should have authority? (3) How can you be certain that the way you read the Bible is the right way to read it?

This essay will inevitably be read by some as an attack on theism and religion, although it is not intended to be such an attack. It could probably get me in trouble with my employers, but I trust they will recognize my good faith in putting this out there.

In my own defense, then, as a prophylactic against attacks by people who see a straightforward defense of ordinary science (even as applied to the Bible) as somehow wrong, let me close with a final observation. The venerable old happy-face TV Evangelist Pat Robertson himself has now gone on record to condemn Ham’s young earth creationism as beyond the pale of acceptable Christian apologetics.

Elias Isquith, “Pat Robertson Begs Ken Ham to Shut Up: Yes, Even the Radical Televangelist Thinks Young Earth Creationist Ken Ham is Ridiculous,” Salon.com (2/5/2014) URL: http://www.salon.com/2014/02/05/pat_robertson_begs_ken_ham_to_shut_up/.

Actually, as with so many things, this video has to be seen to be believed.

My apologies to all for any typos, misspellings, errors, misrepresentations or whatnot. They are unintentional and I’d be happy to correct myself if I made mistakes.

Corrections and Clarifications: in a comment on Facebook, a colleague (Adrienne Akins) asked me to clarify that it was not Faulkner, but his fictional character Gavin Stevens, who said the past is “not even past.” She also objects to my characterization of Mohler (as being too kind, and inaccurate besides, since others might have a better claim to the title of most educated voice in conservative Christianity), but that’s “my money” at stake there, so, for the sake of my rhetorical strategy here, I’ll let that error stand.

Notes on the Editing of this Piece: this post was slightly edited on 2/6/2014.

Five Non-Sexy Strategies to Fix our Federal Fiscal Mess

Americans, it’s time to move on.  Are you tired of dealing with a dysfunctional, ideologically divided government lurching from one factitious crisis to the next?  There is a simple solution and it doesn’t have to do with giving one party or the other “victory” in the he-said-she-said game of legislative brinksmanship.  The solution is to fix the fiscal mess that is the federal budget by acting pragmatically on a long-term plan to reduce deficits and eventually pay down the debt.

Deficit reduction and debt repayment make sense for America.  Why?  Because at root, the partisan bickering that is paralyzing Washington is fueled not so much by ideological disagreements over the role of government in human affairs as by the messy fiscal reality that is our federal budget.  Our representatives are constantly battling over policy priorities and reducing their conflicts to false distinctions in left/right ideology.

What’s more important? National defense or a social safety net?  Both of course!  But in an era of huge deficits, it becomes an either/or fight between so-called “conservative” and “liberal” visions of government.

Our ability to use government to respond to and deal with real world problems is thus truly hampered by our deficit spending and debt.  Existing policies and programs we all love are not financed in a sustainable way, with the result being that we find ourselves in a constant state of legislative warfare.  Congress seems to be locked in an endless showdown over the design and future of important federal programs, with the tedium of bickering being punctuated only by major skirmishes over appropriation bills and debt-limit increases.

This needs to stop.  But it won’t stop by declaring a winner to the ideological fight between “conservative” (or “free market”) and “liberal” (or “socialist”) visions for American government.  It will stop when the people insist on a government willing to adopt long-term, common sense solutions to our fiscal problems.

To that end, here are five decidedly not-sexy, wonkish, but ideologically neutral and/or centrist ideas we could use to fix the federal budget and end this era of fighting over the size of government and the problem of America’s fiscal responsibility.

1) Reinstate a contemporary equivalent to the “PAYGO” rules of the “Budget Enforcement Act” of 1990. PayGo rules were in effect in Washington following the Budget Enforcement Act of 1990, and adherence to them ultimately helped create the budget surplus that was inherited by George W. Bush. PayGo rules were abandoned after 2002, and one result has been ballooning deficits and debt. Bring them back. It’s easy. For details see: http://www.taxpolicycenter.org/briefing-book/background/budget-process/paygo.cfm and http://en.wikipedia.org/wiki/PAYGO.

2) Pass a “Binding Budget Resolution Act,” a law that would require Congress annually to adopt a Binding Budget Resolution (aka BBR). This law would differ from current law in two ways. First, current law does not require that Congressional Budget Resolutions actually bind the government to certain spending levels. This law would cause the annual Budget Resolution process to result in a Binding Budget Resolution that would statutorily cap spending at specified levels (emergency spending exempted). Second, this law would provide a fail-safe mechanism in case Congress fails to enact an annual budget. Current law imposes no penalties if Congress fails to enact a Budget Resolution. Under this law, in the event that Congress should fail to agree on a BBR, and/or the President would not sign the BBR sent by Congress, then, by some specified date in the Fiscal Year (e.g. Sep. 30th), the federal government would automatically be bound by a statutory BBR based on the prior year’s actual spending in all areas. The law would automatically authorize appropriations at the prior year’s spending levels for a period of 12 months, and would automatically extend the treasury’s authority to borrow whatever funds were necessary to pay for such spending for 12 months. Thus, in years where no new budget was enacted, the government would not shut down, but would be required to operate at the prior year’s spending levels without any growth or reduction of any existing budgets.

3) Pass a “Termination of Emergency Spending Act,” a law that would cause all appropriations designated as “emergency spending” to automatically expire at the start of each fiscal year. All areas of emergency spending (spending exempt from statutory limitations under the BBR) would require annual review and could only be renewed by a legislative appropriations process requiring redesignation of such spending as necessary to address national emergencies (as defined by Congress). This law would be designed to prevent periods of extended warfare or other emergencies from being used to circumvent the constraints placed by the people on the federal budgetary process.

4) Pass a “Budget Growth Limitation Act,” a law limiting the annual growth in the total size of the federal budget. The act would limit the rate of budgetary growth to 75% of the average rate of growth in the US GDP, as measured over the prior 5-year period (emergency spending exempted). This law would have the effect, over time, of shrinking the size of the federal budget relative to the size of the US GDP. It would also result in a gradual increase in the ratio of government revenues to government expenditures, by ensuring that the underlying economy grows at a faster rate than the government. Over time, this would cause federal revenues to grow faster than the budget, leading to deficit reduction and, ultimately, surplus. This law would be in effect whenever the total debt-to-GDP ratio exceeds 50%. (A toy numerical sketch of this mechanism follows this list.)

5) Pass a “Federal Debt Reduction Act,” a law that would require two things. First, it would require that any reduction of revenues caused by legislative reductions in federal tax rates or other revenue streams be offset by reductions in expenditures in the Federal Budget. Second, it would require that any budget surplus in federal revenues be used first to pay down the total national debt.  This law would, ideally, preserve any budget surplus that emerged as a result of the new budget process.  It would also force the government to reduce the total debt burden of the US, which would, ultimately, allow the government to do the work of the people with greater flexibility.  This law would be in effect whenever the total debt-to-GDP ratio exceeds 50%.
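Since proposal (4) is the most mechanical of the five, here is a toy simulation of how its growth cap would play out. This is only a sketch under invented assumptions (a steady 4% nominal GDP growth rate and round starting figures); it is not a projection, just an illustration that capping budget growth at 75% of trailing GDP growth steadily shrinks the budget as a share of the economy.

```python
# Toy model of proposal (4): cap annual budget growth at 75% of the average
# GDP growth rate over the prior five years. All figures are invented.

gdp = 16_000.0                  # starting GDP, $B (assumed)
budget = 3_500.0                # starting federal budget, $B (assumed)
growth_history = [0.04] * 5     # assumed trailing nominal GDP growth rates

for year in range(1, 21):
    avg_growth = sum(growth_history[-5:]) / 5   # trailing five-year average
    budget *= 1 + 0.75 * avg_growth             # the proposed statutory cap
    gdp *= 1.04                                 # assume steady 4% nominal growth
    growth_history.append(0.04)
    if year % 5 == 0:
        print(f"Year {year:2d}: budget/GDP = {budget / gdp:.1%}")
```

With these numbers the budget falls from about 21.9% of GDP to roughly 18% of GDP over twenty years, without any nominal cuts anywhere, which is exactly the gradual, non-dramatic point of the proposal.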

These new laws I am modestly proposing here would in no way specify what the budget priorities of the Federal Government should be.  They are unrelated to ideological squabbles over the role of government in human life. They are designed, instead, to limit the capacity of legislatures to commit the US Federal Government to expenditures which the American People are not, in fact, willing or able to pay for.

I admit that  I’m just a regular old citizen.  I’m not an expert in political science, or a lawyer; nor do I have experience as a legislator.  These ideas aren’t being thrown out there because I really believe that I have all the answers.  There are probably problems with these proposals that I don’t or can’t see on my own.

But I hope I am smart enough to think for myself and see that there are some basic things our legislature could do that would lead, however gradually, to a period of greater fiscal stability and sanity in the budget of our federal government.   To me, these proposals seem like reasonable ideas that address real problems with our current methods of funding government.

As we address those problems, I hope we can go beyond the current debate, which seems to be polarized between the “raise taxes and increase investment” camp and the “lower taxes and fight socialism” camp.  If policies such as these were adopted, it would certainly create an environment of greater fiscal stability in the US Government.  Perhaps, we could, by standing together within the framework of a more stable budget, hash out our real ideological differences on the role of government in a more civil and productive fashion.


What if Romney were President?

I heard on NPR today (Wed, Sep. 4th, 2013) an inane report by Ailsa Chang that expressed surprise at the fact that the current crisis in Syria has revealed political complexity in the views of Republicans regarding military action.  Some are, gasp, actually opposed to Obama’s plans to attack the Assad regime.  As if Republican opposition to Obama could not have been foreseen.  Chang’s argument turned on noticing Rand Paul’s supposed opposition to the airstrikes, which contrasts neatly with McCain’s more predictably hawkish point of view:

Paul is one lawmaker in this Syria debate debunking the longheld assumption that the Republican party is the party with the more aggressive foreign policy.  Many conservative and more libertarian Republicans are pointing to their ‘war weary’ constituents.

For a full transcript of the discussion, and the relevant criticisms of Paul, see Senators Rand Paul and John McCain Differ on Syria Strikes : NPR.

Of course, by the end of the day, in spite of opposition, the Senate Foreign Relations Committee rubber-stamped its approval of Obama’s plans to strike, adding only a toothless “no boots on the ground unless we really need them” provision to the resolution.  Yes, some Republicans, notably Marco Rubio and Paul, voted against it.  But that can’t have anything to do with the fact that they are already both implicitly running for President, and hence, against Obama’s record, in 2016, can it?

To test the implicit idea that the Syria “debate” has brought out real complexity in the foreign policy position of Republicans, I propose a quick thought experiment: if Romney were president, how would the current crisis in Syria be handled differently?  My only problem is, once I start thinking like that, I can’t stop myself, and then next thing I know I’m running down the deeper structural problem with the American way of war, all the way down the rabbit hole.

Anyway, let’s do it.  What if Romney were President?

Of course, there would be differences.  And maybe, you might argue, this mess wouldn’t exist in the first place.  Maybe.  But for the sake of argument, let’s assume that it would.

After all, the breakdown in Syria that followed the uprisings in the Arab world that are widely misnamed “the Arab spring” is a result of large, global forces not directly controlled or determined by anything that a sitting US President can do.  Romney could not have stopped the Syrian Civil War.  And I don’t think he can be assumed to have been any more proactive than Obama has been in responding to that war, as it has unfolded over the past two years.

Then there’s the matter of the chemical weapons attack.  Mutatis mutandis, I think even this event can be assumed, for the sake of this thought experiment, to have been inevitable enough to have taken place also on a hypothetical Romney watch.  (Bear in mind that we don’t actually really know why the attacks took place; we’ve been told it was Assad.)

The bottom line is this: I want to argue that Romney, like Obama, would face this crisis, and thus, like Obama, would be forced to respond.

Romney would not, it seems fair to say, have come up with a more creative solution to this crisis than has Obama.  Have you heard a single creative idea about what to do with Syria come from either party?  If Romney had a great idea about what could be done here, surely, he would have shared it with someone … and we all would have heard it by now.  Because we’re desperate for a good idea.  But all we have is “air-strikes.”

No.  Romney would, like Obama, propose retaliatory strikes.  Surgical strikes.  Limited duration punitive air strikes.  It’s not like Obama came up with this idea himself.  The option was proposed and “sold” to him by folks in the National Security establishment.  When all you have is a hammer, everything looks like a nail.  And all that the US has are cruise missiles, fighter jets and naval vessels.  The proposal to use them here was basically inevitable.

Yes, Romney, like Obama, would likely call for strikes.  The chances that he would do nothing or would continue to delay proposing action seem minuscule.

It also seems likely to me that the only real difference would lie in the partisan and media reaction caused by the proposal to strike Syria.  If Romney were President, the reaction of US lawmakers and opinion makers would be somewhat different.  I don’t know whether Romney would have, like Obama, sought the approval of Congress.  But let’s suppose that he did.  In such a case, surely, Rush Limbaugh would not be so quick to ridicule such a move as weak and incompetent, as he has ridiculed Obama’s decision to seek congressional approval.  Limbaugh’s criticism turns mainly on partisan loyalty.  (Whether he happens to be right this time is beside the point; he would simply NOT use the same level of vitriol if Romney were president; this is predictable; this is known.)  On the other side of the media aisle, because of its clear liberal sympathies, I think it is fairly obvious that the mainstream press would, on the whole, be more openly skeptical about the strikes than they have been under Obama.  They would be more likely to publish comments from the “opposition,” i.e. from left-of-center voices who are critical of the proposed strikes (a voice curiously absent in the current reportage, which is focused on “surprising” Republican opposition to Obama’s plans, and former anti-war protester John Kerry’s boosterism).  In Congress, things would also be different.  Under Romney, the Democrats in Congress would, it seems likely, be much more willing to oppose the strikes than they appear to be under Obama.  Similarly, Republicans would be more muted in their opposition, if they expressed any opposition at all.

All this is completely predictable, because as we have seen over the past 20 years, literally every “debate” in Congress is driven primarily by partisan loyalties, or perceptions thereof, while actual ideological or political debate is minuscule or exists only in caricature.

Now, for the rabbit hole.  Note that all these differences under a hypothetical Romney presidency would change nothing regarding the outcome.  That’s because what is happening right now is way bigger than partisan politics, and it doesn’t seem to matter very much what side of the aisle support or opposition is coming from.  The deck is stacked.  What matters is who holds the cards.  And it ain’t Congress, the president, or the media.

Would Romney have called for a congressional vote on the strikes?  Maybe, maybe not.  Does it matter?  No.  Because either way, the strikes would happen, as they will under Obama.  All that congressional approval or lack thereof adds to the mix is a different moral terrain for political posturing for or against the strikes.  Under our system of government, it really doesn’t matter which politicians oppose the strikes, or support them, or why.  What matters is that the strikes are inevitable—in spite of political misgivings—unless the citizens rise up and stop the madness by extra-political means.

The real forces that are driving the impending strikes on Syria are deeper, and are expressions of economic power-structures.  These structures of capital and industry are the forces that have, over the past 50-60 years, provided the economic incentive for the US to act, over and over again, as the “world’s policeman.”  They are related to what Eisenhower called the “military-industrial complex,” but that complex is now better seen as the intricate “public-private” partnership that makes government work by tying energy, intelligence, banking, arms-manufacturing and military contracting interests together.  The elites who stand behind this fascist system of state-sponsored security industries effectively dictate the terms of every aspect of US foreign policy in the interest of controlling the world economy for their own benefit.

I am a firm believer that the office of the presidency changes people, usually for the worse.  We all know that physically, it breaks them down and ages them rapidly.  Occasionally, it has been known to kill them.  Politically, the office breaks every idealistic and rational bone in their political being.  Once he is in office, the president quickly comes to realize that (a) the president doesn’t really make legislation or control spending, and (b) the president doesn’t really set the agenda domestically or internationally.  Sometimes the President can gain the initiative, and push through a reform in one direction or another.  But especially in foreign policy areas, the president is at the mercy of the random forces that sculpt human history, and is also largely a prisoner of a policy culture that was put into place before his time and is now simply being transmitted to him.  The president is a pawn in a huge economic game of world domination, where full-time hard-core “players” make the rules—or made them long ago.  From the beginning of his term in office, the president will likely be forced into a reactive mode where international events, inherited political culture, and the structure of the US and world economies will more or less dictate his responses.

Furthermore, if international events do not cooperate with the elites who need the President to serve their interests predictably, then the perception of such events can actually be manufactured.  Don’t think it hasn’t been done before, or that it won’t be done again.  To a hypothetical Romney as to a real Obama.

Remember when Obama was a critic of the Iraq war and of the use of trumped up intelligence on weapons of mass destruction that got us there?  Remember when he wanted to close Guantanamo?  Do you recognize your supreme leader?

To paraphrase Sarah Palin (bless her heart): “how’s that hopey-changey thing working for you now?”

Mythically, this transformation of the idealistic candidate into the game-playing office-holder, the candidate’s loss of full political agency, has often been represented in conspiracy culture through the story of the fabled “first briefing,” in which a newly elected President meets with intelligence agents who share the top-secret truth about aliens, or ancient dread gods of the underworld, or monsters, or the like. Thus, upon entering office, the former candidate suddenly finds out the awful truth about exactly why he won’t have quite as many options, politically, as he thought he would.

The kernel of truth that has given rise to such myths is plain to see in the unfolding of the Syrian crisis.

Nobody among politicians today really believes that shooting cruise missiles at Damascus will solve Syria’s problems, or stop future chemical weapons attacks, or make anyone here or there safer at night. And yet, we’re planning to do so. There can be only one explanation for why such a course of action seems like a good idea to those at “the top” of the government. They’re in a policy bubble, created by the real powers that run the game, in which they are presented with no other viable options. The game is dominated by the elites whose economic power depends on the continuation of such policies. For fun, I like to refer to this shadowy group of elites as “General Dynamics,” after the company that manufactures our Tomahawk cruise missiles. But of course that’s just a metonym. It’s an incredibly complex system of interacting forces that systematically closes off other courses of action before anyone (besides your odd minority of powerless hippies, libertarian wackos, Jesus freaks, and revolutionaries) can even become conscious that real alternatives exist. Because, as we all know, they really don’t exist, do they?


POSTSCRIPT, SEP 5, 2013:

For an interesting article that parallels many of my points, but with a less paranoid style, read the following: Rosa Brooks, “Obama Can’t Win,” Foreign Policy, 9/4/2013.

The Most Important Outcomes for an Introductory Biblical Studies Course

Listed in order of importance:

  1. Student demonstrates a capacity to look up, find and use the Study Bible’s content: introductory essays, maps, tables, annotations, and biblical books.
  2. Student demonstrates mastery of the system of abbreviations for Biblical books.
  3. Student recognizes, correctly parses, and can explain the typographical features of the modern printed Bible, including capitalization, spelling and translation conventions, punctuation, use of paragraphs, lines, verses, and chapter divisions.
  4. Student demonstrates a mastery of in-text citation of Biblical writings using the appropriate abbreviations, chapter, and verse numeration system.
  5. Insofar as they are known and widely agreed upon, the student can describe the historical circumstances of the origins of the various Biblical writings, correctly identifying relevant time periods, geographical regions, languages, cultural contexts, and ethnic populations associated with those origins.
  6. In formal writing, student can paraphrase and analyze the structure of the contents of major sections of Biblical books, properly citing the passages referred to.
  7. In formal writing, student can effectively quote and cite from relevant biblical texts, demonstrating a proper understanding of the literary and historical context of quoted material.
  8. In formal writing, the student can properly cite and use the course textbook to provide relevant supporting information.
  9. Student correctly identifies the main historical and political events that coincide with the origins of the Biblical writings and with the content of biblical stories, and can construct a timeline integrating data about significant persons, events, empires, wars, periods, and books.

the exhaustion of criticism and “pseudo-modernism”

A recent entry in this blog entitled “the exhaustion of criticism” (published July 9th, 2011) accused academic critical study in general, and Biblical Criticism in particular, of exhausting itself (and its potential readership) to the point of a complete disciplinary unraveling.

I do believe that scholars working in the Humanities (Philosophy, Theology, Religious Studies, History, Cultural and Media Studies, Art and Literary Criticism, and cognate fields) face a dangerous threat of exhaustion unto death. Attribute the situation to any number of factors.

During the past century and a half the earth’s population has exploded, leading to a proliferation of new philosophies, theologies, religions, historical investigations, and artistic and literary productions. The sheer volume of potential objects for critical activity would overwhelm the capacities of any particular scholar.

Of course, disciplinary mastery has always been elusive, but today, it is frankly impossible. Given that the number of potential contemporary objects of study far exceeds the (already numerous) important objects that have survived from previous centuries and from antiquity, the balance of work to be done must focus on “contemporary” materials. And so, in this context the idea of a classical canon of important works is increasingly hard to defend to an academic audience — forget about successfully selling the canon to the next generation of readers, distracted as they are by hundreds of cable channels, tens of thousands of games, millions of websites, and the myriad options for textualized and mediated connectivity in the world of social media.

Coincident with this proliferation of potential objects of criticism, and the “contemporization” of discourse, there has been an inevitable fragmentation of academics into ever more numerous and narrow specializations. Increasingly, intellectuals find themselves separated into camps, pockets, and subcultures, talking either at cross purposes, or on parallel, non-convergent lines. Within the subdisciplines, the proliferation of work forces academics to turn from the objects that should properly occupy their energies toward bibliographic and pedantic analysis of scholarship. Scholarship becomes scholarship on scholarship, and it seems ever less likely that we can expect synthetic work to emerge or find a broad, popular audience. Nobody is listening.

At the same time, economic practices have shifted dramatically, so that scholars are no longer in the economically privileged position of their great-grandfathers. Scholarship is not a field dominated by men with wives and nannies for the children, maids for the home, cooks for their food, and secretaries for their paperwork. Most scholars today lack such economic supports for their life of the mind. Instead, we labor alongside other servants in the service economy, being extremely lucky to have our summer vacations as the main thing that sets us apart from other typical dual-income middle-middle-class suburban/urban households, where children, housework, and the worries of everyday life constantly undermine “serious work.”

And beyond this, while a few exalted academics at elite schools still enjoy the leisure and economic support they need to do “research” (All Souls College at Oxford University springs to mind… it still exists), most scholars are teaching more credit hours, to more students, for less pay, with fewer teaching assistants, all the while dealing with an even greater emphasis on the culture of bureaucratic oversight for their work. Try saying the phrase “outcomes assessment” to any college professor you know and watch his or her brain start to melt inside the skull. It’s amusing.

Such mundanities are certainly not the only set of distractions for middle-aged scholars like myself. We operate in the same informational matrix as our students; social media bombards us with the constant recommendations of (or banal updates from) friends, and our work email inboxes overflow with “carbon-copied” announcements of events and other chatter. My generation was the first to work through school with the burden of knowing about the psychological problem called “ADHD,” but this disorder seems to have been robustly adopted as a typical cognitive paradigm.

These words of mine today were inspired by a recent essay in Philosophy Now, on the “Death of Postmodernism.” I think if the essay is read correctly, it matters very little whether its author, Alan Kirby, is correct that “Postmodernism” is a useful scholarly rubric or even a live movement in art and culture. (The “postmodern” already seemed like a dead issue when I first heard the term and tried to read the impenetrable postmodernists back in my sophomore year of college, in 1989; my friends and I instead suggested that we should just be “postfuturists” and get over ourselves already). What seems important, instead, is to notice how exhausted Kirby sounds, as if he simply cannot fathom how we will ever, as scholars, come to terms with the technological transformations of text and reading practices that he so deftly identifies. He calls the new way of creating and reading texts “pseudo-modern,” which stretches the idea of “modernity” well past the breaking point, as far as I am concerned. Like me, he sees a bewildering variety in the modern show, text, and game, and also like me, he despairs at the undeniably vapid and shallow nature of it all. The fact that Kirby’s analysis turns on an apparent generation gap (speak for yourself, sir!) between today’s teachers and the supposedly different readers and consumers that make up today’s students highlights the same issue that I raise above: faced with ever expanding ranks of junior human beings, with their strange ways of talking and their unfathomable tastes in music and art, many of us in older generations are apt to freeze like the proverbial deer in the headlights.

I’d keep working on this little rant and give my suggestions about how I think we ought to deal with this desperate situation of our exhaustion, but I have to stop writing and take care of my 21-month-old son, James.

Republicans are crazy about taxes

Republican leaders weathered the recent debt ceiling debate with aplomb, fiercely and tirelessly resisting all calls for increases in government revenues through new taxes. Their opposition to new taxes is so absolute, so unrelenting, that it effectively rests on a total renunciation of belief in the legal morality of taxation. Republicans would be more philosophically consistent if they simply admitted that they hate all taxes and that the very idea of taxes appalls them.

But such a position amounts to support of no funding at all for government. It would be crazy for American politicians, who are paid handsome salaries and given handsome benefits at taxpayer expense, to oppose all funding for government. Right?

Lepore is wrong about meaning in poetry

Ernie Lepore, “Poetry, Medium, and Message,” The Stone, New York Times Online, 7/31/2011.

Rutgers philosopher Ernie Lepore writes about poetry in yesterday’s installment of “The Stone,” a philosophy “blog” on the New York Times. Something about the article rankled me and inspired this cranky response.

Lepore says the New Critics locate meaning, and the resistance of poetry to both translation and explanation, in a magical (or mystical) property of the words themselves, but his critique describes a pomegranate, examines an orange, condemns an apple, then recommends something that looks to me just like a pomegranate. Apples to oranges, Lepore.

He confounds words like “meaning,” “interpretation,” “translation,” and “paraphrase” as if their mutual substitution could be accomplished without qualification. In general, he seems to think definitional meaning resides at the level of the word, or nearly so, and does not discuss combinatorial syntax, figure, image, denotation, connotation, or manifest versus latent content. No mention of cultural identity, class, historical and geographical factors, codes, or other vexed subtleties of discourse. Apparently he wrongly assumes these do not matter to his argument. Perhaps he thinks they can be easily disposed of in phrases like … “Linguistic expressions mean whatever they mean wherever they occur” — but that is just silly.

His article takes on the problem of explaining the truth behind the persistent ‘heresy of paraphrase’ … beginning with a witticism from Eliot … literalism, I’d call Eliot’s quip … as if New Critics all just thought that poetry means only what is said. He conflates the idea with translators’ complaints that languages resist full translation. (If languages did not resist translation, we’d all be effortlessly multilingual.) And he mixes in the notion, one I find typical of undergraduate writers, and unworthy of a philosopher, that interpretation can be accomplished by plot summary.

He ends up defending a simple distinction between articulation (“perceptible” or measurable qualities of the presentation of words) and lexical meanings. Poetry brings intentional articulation into play, and that’s why it can’t simply be paraphrased.

He writes: “Of course, we can introduce a new expression to mean exactly whatever an old expression means but since poems can be about their own articulations, substituting synonyms will not result in an exact paraphrase or translation. To do so requires not only synonymies but also identical articulations, and only repetition ensures this end.”

Notice here that what he calls articulation is reserved by definition from “meaning” … but this simply cannot be the case.

He says “the poet wants to draw the audience’s attention to … articulations as much as to the ideas the words so articulated express” but also says poetry differs from prose only in that it can be more “about” its articulation.

Once you’ve admitted that the poet can take language and make it about its articulation, you’ve committed yourself to a robust and complex idea of meaning. It doesn’t matter if you jest, like Eliot, or dismiss complexities with a vapid gloss like “linguistic expressions mean whatever they mean wherever they occur, but in poetry (as in other forms of mentioning) the medium really becomes the message.”

No poem can “be about” something other than its meaning; meaning is the aboutness of discourse.

A paragraph or so earlier Lepore had blasted the New Critics for claiming that “form shapes content” [his italics, ironically], a notion he mocks as both “quasi-mystical” and “magical” … utter nonsense. Only the color-blind, the tone-deaf, and the naive would dispute the idea.

A poem’s unique articulations have been brought into its meaning; manifest features manifestly shape the content of discourse. Meaning cannot be found in a dictionary or thesaurus, and you cannot even look up the meaning of poems in books.

I think Lepore is right on the money to emphasize the importance of articulation as a key way of distinguishing poetry from prose; of course, an older age was content to call this music, and under his analysis, we can still call it music.

Lepore implies that an interpretation and a summary of a poem amount to the same thing, and then ends up defending a ridiculous version of ‘the heresy of paraphrase’ on the allegedly new grounds that summaries leave out the music. He doesn’t actually care to interpret poetry, so he seems not to notice that his conflation of paraphrase and meaning would make interpretation, by his own account, more or less impossible for more “poetic” texts. Again, nonsense. (Also nonsense: that ‘interpretation’ is restricted to discovering “meaning” on his or my terms.)

Meaning must be sought afresh in every utterance, as a dog finds water in a bowl, as a parent places a hand on a child’s forehead, as a student looks at the clock. Meaning is not strictly lexical or referential; it is critical. A worker opens the day’s newspaper, or a believer approaches scripture. All meaning is phenomenon; it resides in the lived, temporal, historical present — the appearance or presentation — of communicative action. This necessarily includes linguistic and paralinguistic features.

To talk about meaning in things, among other things and their meanings, that’s interpretation.

Beware of pomegranate seeds.

the exhaustion of criticism

The last lines of Qohelet (Ecclesiastes, aka “The Teacher”) are those of an editor, who advises us to value the straightforward words of a single wise person far more than all the many books produced by the scholars and seekers in the world; he laments: “of making many books there is no end, and much study is a weariness of the flesh” (Eccl. 12:12).

One single reliable authority can surpass a mass of experts? If only! If only such authority existed.

For the purposes of this little rant there is no reason to try to justify the unjustifiable. So let me state at the outset that I will go ahead and join prophecy to wisdom. My vision is for Biblical Studies.

When the history of the present era of Scholarship on Biblical Literature is written, one hundred or two hundred years from now, it will likely be seen as an era of exhaustion. This is the era of criticism’s exhaustion. And of the exhausted critic.

Dominated by the exhaustive literature review, the exhaustive review of evidence, the exhaustive survey of theories, perspectives, and interpretations, scholarship today puts the greatest and most elite scholars to work with great energy cataloging, compiling, and critiquing … their own contribution to the subject of study. The bibliography and annotations are an art form unto themselves. Year by year, there is an endless parade of published work on subjects that interest Biblical scholars… while the subjects themselves vanish in a fog of vague words about the published scholarship. No one has time to read anybody else deeply, because there are too many conversation partners. And the primary sources are not read deeply, closely, or creatively enough.

We are examiners of interpretations. We splice and winnow and classify, anxious to include all, to miss nothing, to note precedent. We cannot proceed with any interpretation apart from the examination of other interpreters. We map carefully the entire landscape of critical discussion, and only then stake our own place in the field, often defining it by the slimmest and most subtle differences of discourse.

Acrobats of research, we ascend mountains of books, great edifices of commentary, surmounting the work of others by means of the slenderest toeholds and handholds of criticism, on virtual cracks, muscling our way past overhanging cliffs of redundant reviews of reviews of reviews of evidence, determined to contribute but with no real sense of what needs doing. We climb it because it is there.

If this were all, it would be enough, but it is far from all. Having exhausted traditional areas of interpretation, Biblical Studies defines new fields, which are much more difficult to exhaust, such as the theoretically limitless project of Wirkungsgeschichte. Everything and anything can be subjected to Wirkungsgeschichte. It is endless. After the canonical, there is the pericanonical, heterocanonical, apocryphal, novelistic, polemic, heresiological, apologetic, historical… there is art and architecture, music and graffiti, artifacts and locations, film, television, the internet, text messages… there is no boundary. The scholar cannot relax, because ars longa, vita brevis.


Book after book is published. Yet, all literary art is dead. The scholar apologizes, as if for a fart, after a moment of levity is emitted into print. A well-turned phrase is buried beneath a jumbled pile of shallow annotations. There are only a couple of poets, and they are writing about narrow, technical things.

Consider, as well, that greatest of achievements itself. The Bibliography. The longest, greatest bibliography — howsoever long or great — if it be on a specialized area of research, resembles nothing so much as a chronicle of oblivion.

Why are we not more wary of publishing?

For each of us, in the end, after all, there is no ars; there is only brevis vita. In this life there is limited energy available for scientific investigation — and family, friends, food, actual art, and everything else too. At journey’s end you must leave your learning behind you. It is useful to remember this, young scholar. There is no time for idle questions. Make sure your questions are interesting enough that the answers will matter after you are dead and gone. And all the usual advice about style applies too.

weiner public tweeter

Well, you know, when you mean to send a private message and it ends up going on your main twitter feed… that’s pretty embarrassing. When it’s a scandalous message… and gets attention, and you thereafter claim that someone else did it, that you were hacked, when you blame the muckraking blogger who reported the issue… that’s a pathetic lack of responsibility. Public fail = public fail. It’s not private when it’s on twitter (instant transmission to 50k followers, both friends and enemies!) and it’s not private when you have to hold a press conference to discuss your fail, because you aren’t a private person, you are a public figure. We have seen, over the past several years, a number of politicians fall prey to these social media driven scandals (emails, text messages, facebook, twitter), and there’s apparently been a lesson lost. Personal technology, social media, web 2.0: all this has moved the “private” into a public space. If you wouldn’t want your husband/wife, grandmother, and mother to read it, you probably shouldn’t EVER type it on any kind of electronic device. Period. “There is nothing hidden that will not be revealed.” Truer than ever.
