Tuesday, March 20, 2018

Marketing and Endangered Colleges

Colleges: An Endangered Species?
Andrew Delbanco
MARCH 10, 2005 ISSUE

Stover at Yale
by Owen Johnson
Ross and Perry, 386 pp., $34.95 (paper)

The Future of the Public University in America: Beyond the Crossroads
by James J. Duderstadt and Farris W. Womack
Johns Hopkins University Press, 236 pp., $39.95; $21.95 (paper)

The Uses of the University
by Clark Kerr (fifth edition)
Harvard University Press, 261 pp., $22.50 (paper)

Shakespeare, Einstein, and the Bottom Line: The Marketing of Higher Education
by David L. Kirp
Harvard University Press, 328 pp., $29.95; $17.95 (paper)




1.

Every middle-class American family with a college-age child knows how it goes: the meetings at which the high school counselor draws up a list of “reaches” and “safeties,” the bills for SAT prep courses (“But, Dad, everyone takes one; if you don’t let me, I’m screwed”), the drafts of the personal essay in which your child tries to strike just the right note between humility and self-promotion—and finally, on the day of decision, the search through the mail in dread of the thin envelope that would mean it’s all over and that, as a family, you have collectively failed.
The struggle to get into America’s leading colleges is, of course, the dark side of a bright historical development. Until about fifty years ago, our most prestigious academic institutions were pretty much the domain of well-born prep school boys. In 1912, Owen Johnson’s enduringly popular novel (most recently reprinted in 2003) Stover at Yale gave a picture of Ivy life as a gladiatorial contest among alpha males who, by beating out their rivals for a spot on the team or in the club, learned to achieve “victory…on the broken hopes of a comrade,” and went on to rule the nation. In 1920, Scott Fitzgerald (Princeton ’17) called Stover at Yale the “textbook” for his generation. Writing a few years later about Harvard in his novel Not to Eat, Not for Love, George Weller remarked on “how similar the faces always looked in the Varsity picture, except where there was an Irishman or a Jew, and even then they seemed somehow anglicized down toward alikeness.” In Weller’s novel, one Harvard bureaucrat runs a brisk business selling “the addresses of selected Anglo-saxon sophomores to the mothers of Boston débutantes” lest some “anglicized” Irishman or Jew pass for a WASP and, by means of an unwary Beacon Hill belle, contaminate the race. As late as the outbreak of World War II, these fictions had the plausibility of fact.
At the turn of the century, when Stover was prepping for Yale, fewer than a quarter-million Americans, or about 2 percent of the population between eighteen and twenty-four, attended college. By the end of World War II, that figure had risen to over two million. In 1975, it stood at nearly ten million, or one third of the young adult population. Today, the United States leads the world by a considerable margin in the percentage of citizens (27 percent or 79 million) who are college graduates.1
To advance this immense social transformation required many means—notably the GI Bill, passed by Congress in 1944, which brought onto America’s campuses students whose fathers could have set foot there only as members of the janitorial class. Starting a decade later, the Ivies did their part by establishing “need-blind admissions” and “need-based financial aid”—by which they promised to accept qualified applicants regardless of their ability to pay, and to help support needy matriculants by assessing family assets and making up with scholarship aid whatever the family could not afford. A new system of standardized testing (the SAT) identified talented students, many of whom were subsidized by federal programs designed to train scientists and strategists for the postwar struggle against communism. It was during the cold war that the Old Boy with his Rudy Vallee (Yale ’27) intonation and “Gentleman’s C” became an anachronism.
Progress in the public universities was equally striking. By 1960, the University of California at Berkeley was challenging Harvard in accomplishment and prestige, and the “flagship” branches of other state universities such as Michigan, Ohio, Wisconsin, and, more recently, Texas and North Carolina joined the ranks of the world’s leading institutions. Later in that decade, in part because of competitive pressure from public rivals, Yale and most of its private peers opened their doors to women, and the push was on to recruit students from “underrepresented” (to use today’s stipulated bureaucratic language) minority groups. In short, by the late twentieth century the America that Tocqueville had described 150 years earlier as a nation where “primary education is within the reach of everyone” but “higher education is within the reach of virtually no one” had been turned upside down.2
This mostly happy story is well known. Less well known is the most recent chapter, which tells of a slowdown, if not reversal, of the trend toward inclusion. Over the last twenty-five years, as the tax revolts of the 1970s (starting in 1978 with California’s Proposition 13) became chronic tax resistance, state support for public universities has sharply fallen. Public funds now cover less than one third of expenses at public universities—with the result that tuitions are rising beyond the reach of families who once would have depended on these institutions as pathways of upward mobility.3 Tuition at the State University of New York jumped by 28 percent between 2002–2003 and 2003–2004—the price of attending is now over $14,000 (tuition, housing, and fees)—and other state systems saw similar increases. The average tuition increase at all public universities last year was 10.5 percent, four times the rate of inflation.
Calling the wealthiest private universities “predators” and “carnivores,” Duderstadt warns that public universities may have to “unleash the T word, tax policy, and question the wisdom of current tax policies that sustain vast wealth and irresponsible behavior at a cost to both taxpayers and to their public institutions.” As if to underline his point, The New York Times reported a few months ago that part of the fees paid by institutional investors to a management firm spun off from the Harvard Management Company (the nonprofit entity that oversees Harvard’s $22 billion endowment) has been going to Harvard as tax-exempt revenue.4
At the Ivy League colleges, where financial aid was previously awarded strictly on the basis of need, “merit aid” is increasingly used to recruit especially desired students who may not need the money—with the result that less money is available for students who do. And applicants are stampeding toward early admissions programs that offer, in exchange for a promise to attend if admitted, a better chance of getting in. These programs, which now account for roughly half of all enrolled students in the Ivy League, favor candidates from private or suburban schools who have well-connected counselors (sometimes privately hired) and the financial freedom to pick a college without waiting to compare financial aid offers—and the colleges know it.5
In academia, in short, no less than in other privileged corners of American life, money is being funneled into the hands of a relative few. Once-shabby college towns have become boom towns where old dives remembered fondly by alumni are now upscale restaurants to which today’s students bring their high-limit credit cards, and parking lots are crowded with student SUVs.
The recent history of elite higher education is usually told as a glorious story of democratization. But future historians may look back and see something different: a restrictive age of old money (1900–1950), followed by an interregnum of broadened access (from the 1950s into the 1980s) and then a period (circa 1990–?) in which new money poured in. America’s colleges continue to serve extraordinary students from many backgrounds, but Stover’s drinking song is being sung again:
Oh, father and mother pay all the bills,
And we have all the fun,
That’s the way we do in college life.
Hooray!6

2.

Amid these troubling developments, one hopeful sign is the growing public debate over who should go to college and how it should be paid for.7 Yet one hears comparatively little discussion of what students ought to learn once they get there and why they are going at all. Over my own nearly quarter-century as a faculty member (four years at Harvard, nineteen years at Columbia), I have discovered that the question of what undergraduate education should be all about is almost taboo.
Most American colleges before the Civil War (more than five hundred were founded, but barely one hundred survived) were, in Richard Hofstadter’s words, “precarious little institutions, denomination-ridden, poverty-stricken…in fact not colleges at all, but glorified high schools or academies that presumed to offer degrees.” A bachelor’s degree did not have much practical value in the labor market or as a means of entering the still-small managerial class. The antebellum college was typically an arm of the local church—an academy for ministers, missionaries, and, more generally, literate Christians—that remained true to the purpose of the oldest American college, Harvard, which had been founded in dread “lest the churches of New England be left with an illiterate ministry…when our present ministers shall lie in the dust.”
As sectarian fervor cooled, the colleges became less closely tied to the churches, though most retained a strong religious tone through the mid-1800s. Whatever the particular method or creed, there was consensus, in “an age of moral pedagogy,” that the primary purpose of a college education was the development of sound moral character.8 A senior-year course in moral philosophy, usually taught by the college president, was almost universal. As the grip of religion loosened further over the course of the century, and the impact of Darwin transformed intellectual life, colleges changed fundamentally, becoming largely secular institutions devoted less to moral education than to the production and transmission of practical knowledge.
By the mid-nineteenth century, the need for expert training in up-to-date agricultural and industrial methods was becoming an urgent matter in the expanding nation, and, with the 1862 Morrill Act, Congress provided federal land grants to the loyal states (30,000 acres for each of its senators and representatives) for the purpose of establishing colleges “where the leading object shall be, without excluding other scientific or classical studies, to teach such branches of learning as are related to agriculture and the mechanic arts.” Eventually these “land-grant” colleges evolved into the system of state universities.
At the same time, as the apprenticeship system shrank and some professional careers began to require advanced degrees, the impetus grew for the development of private universities. Some took shape around the core of a colonial college (Harvard, Yale, Columbia), while others (Chicago, Northwestern) came into existence without any preexisting foundation. Still others (Clark, Johns Hopkins) had at first few or no undergraduate students. In 1895, Andrew Dickson White, the first president of Cornell, whose private endowment was augmented by land granted to New York State under the Morrill Act, looked back at the godly era and declared himself well rid of “a system of control which, in selecting a Professor of Mathematics or Language or Rhetoric or Physics or Chemistry, asked first and above all to what sect or even to what wing or branch of a sect he belonged.”
The idea of practical and progressive truth to which the new universities were committed was, of course, not entirely novel. It had already been advanced in the eighteenth century by Enlightenment ameliorists such as Benjamin Franklin, who anticipated a new kind of institution, to be realized in the University of Pennsylvania, that would produce “discoveries…to the benefit of mankind.” Roughly one hundred years later, Charles W. Eliot, the president who turned Harvard College into Harvard University (and who was himself descended from Puritan clergy), explained that a modern university must “store up the accumulated knowledge of the race” so that “each successive generation of youth shall start with all the advantages which their predecessors have won.”
By 1900, professors, no less than physicians or attorneys, had become certified professionals, complete with a peer review system and standards for earning credentials—which one of Eliot’s faculty members, William James, referred to as the “Ph.D. octopus.” Faculty began to benefit from competitive recruitments in what was becoming a national system of linked campuses; and when some rival university came wooing, the first thing to bargain for was, of course, a reduced teaching load. Seven years after Eliot’s inauguration speech in 1869, the Harvard philologist Francis James Child was exempted from grading undergraduate papers in response to an offer of a job from Johns Hopkins.
By the end of the nineteenth century, the professionalized university had absorbed schools of medicine and law that had typically begun independently, and was acquiring teacher-training schools, along with schools of engineering, business, and other professions. It was on its way to becoming the loose network of activities that Clark Kerr, president of the University of California, famously called the “multiversity.” When Kerr coined that term in 1963, in The Uses of the University, he remarked on the “cruel paradox” that a “superior faculty results in an inferior concern for undergraduate teaching,” and he called this paradox “one of our most pressing problems.”
Since Kerr wrote, the problem has gotten worse. Today, as David Kirp points out in Shakespeare, Einstein, and the Bottom Line, New York University, which has lately made a big (and largely successful) push to join the academic front rank, employs “adjunct” faculty—part-time teachers who are not candidates for tenure—to teach 70 percent of its undergraduate courses. The fact that these scandalously underpaid teachers must carry the teaching burden—not just at NYU, but at many other institutions—speaks not to their talent or dedication, but to the meagerness of the institution’s commitment to the teaching mission. At exactly the time when the struggle to get into our leading universities has reached a point of “insane intensity” (James Fallows’s apt phrase), undergraduate education has been reduced to a distinctly subsidiary activity.9

3.

Under these circumstances, one might expect to see students fleeing to colleges whose sole mission is teaching undergraduates. Fine colleges such as Swarthmore, Amherst, and Williams, which have significant endowments and high academic standards, do indeed have considerable drawing power. Yet these are small and relatively fragile institutions, and even the best of them are perennial runners-up in the prestige game, while other impressive colleges—such as Centre College in Kentucky or Hendrix College in Arkansas—must struggle, out of the limelight, to compete for students outside their region.
The leading liberal arts colleges will doubtless survive, but they belong to an endangered species. Michael S. McPherson, president of the Spencer Foundation and former president of Macalester College, and Morton O. Schapiro, president of Williams, report that even now “the nation’s liberal arts college students would almost certainly fit easily inside a Big Ten football stadium: fewer than 100,000 students out of more than 14 million.”10 In today’s educational landscape, barely one sixth of all college students fit the traditional profile of full-time residential students between the ages of eighteen and twenty-two. One third of American undergraduates now work full-time, and more than half attend college part-time, typically majoring in subjects with immediate utility, such as accounting or computing. These students, and their anticipated successors, are targets of the so-called electronic universities that seek a share of the education market by selling Internet courses for profit. A few years ago, the president of Teachers College at Columbia University predicted that some wily entrepreneur would soon “hire well-known faculty at our most prestigious campuses and offer an all-star degree over the Internet…at a lower cost than we can.”11
As for the relatively few students who still attend a traditional liberal arts college—whether part of, or independent from, a university—what do they get when they get there? The short answer is freedom to choose among subjects and teachers, and freedom to work out their own lives on campus. Intellectual, social, and sexual freedom of the sort that today’s students assume as an inalienable right is never cheaply won, and requires vigilant defense in academia as everywhere else. Yet there is something less than ennobling in the unearned freedom of privileged students in an age when even the most powerful institutions are loath to prescribe anything—except, of course, in the “hard” sciences, where requirements and prerequisites remain stringent. One suspects that behind the commitment to student freedom is a certain institutional pusillanimity—a fear that to compel students to read, say, the major political and moral philosophers would be to risk a decline in applications, or a reduction in graduation rates (one of the statistics that counts in the US News and World Report college rankings closely watched by administrators). Nor, with a few exceptions, is there the slightest pressure from faculty, since there is no consensus among the teachers about what should be taught.
The history of American higher education amounts to a three-phase story: in the colonial period, colleges promoted belief at a time of established (or quasi-established) religion; in the nineteenth century, they retained something of their distinctive creeds while multiplying under the protection of an increasingly liberal, tolerationist state; in the twentieth century, they became essentially indistinguishable from one another (except in degrees of wealth and prestige), by turning into miniature liberal states themselves—prescribing nothing and allowing virtually everything.12 Anyone whose parents or grandparents were shut out from educational opportunity because of their race, ethnicity, or gender is thankful for the liberalizing trajectory of higher education—but as in every human story, there is loss as well as gain.
—This is the first of two articles.
  1.
    Henry Rosovsky, The University: An Owner’s Manual (Norton, 1990), p. 12; Richard Vedder, Going Broke by Degree: Why College Costs Too Much (American Enterprise Institute, 2004), p. 111. 
  2.
    Alexis de Tocqueville, Democracy in America (1835), translated by Arthur Goldhammer (Library of America, 2004), p. 58. 
  3.
    James J. Duderstadt and Farris W. Womack, The Future of the Public University in America: Beyond the Crossroads, p. 16. 
  4.
    Stephanie Strom, “Harvard’s Invisible Fund-Raising,” The New York Times, July 25, 2004. 
  5.
    Ronald G. Ehrenberg, Tuition Rising: Why College Costs So Much (Harvard University Press, 2000), pp. 78–80, explains how, since 1992, when the US Justice Department forced Ivy League institutions to cease collaboratively setting financial awards to commonly admitted applicants on the basis of need, a price war has broken out, exerting inflationary pressure on tuition. Awards to students with multiple offers go up while students in a less advantageous bargaining position may receive less aid than they need—especially from the less wealthy schools. On the effect of early admissions programs, see The Early Admissions Game: Joining the Elite (Harvard University Press, 2003). Based on data drawn from half a million applications to fourteen highly selective colleges, Christopher Avery, Andrew Fairbanks, and Richard Zeckhauser confirm that “applying early provides a significant admissions advantage, approximately equivalent to the effect of a jump of 100 points in SAT-1 score.” Contrary to the public claims of many colleges, “early applicants tend to be slightly weaker…than regular applicants.” The authors also point out that “since Early Decision admits cannot apply to other colleges, they forgo any leverage they would gain by documenting the financial aid offered by other schools. Each ED admit saves money for a college (on average), and also…saves the college considerable headache by precluding the possibility of such negotiations over financial aid in the spring.”  
  6.
    The literature on the high cost of college is large and contentious. Conservatives argue that academia has a “productivity problem”—since students take longer on average to earn their degrees than they did in the past, but are not demonstrably better educated (Vedder, Going Broke by Degree, p. 59). Liberals respond, as former Dartmouth president James Freedman puts it in Liberal Education and the Public Interest (University of Iowa Press, 2003), that “higher education is expensive because it is a labor-intensive activity in which the common devices for increasing productivity—larger class sizes, reductions in faculty, increases in faculty teaching loads, delayed replacement of laboratory equipment, diminished purchasing of library books, deferred maintenance of facilities—compromise an institution’s quality.” Our two richest universities, Harvard and Princeton, have recently announced measures to make themselves more affordable by eliminating loans in favor of grants (Princeton), and by no longer requiring that students with family income below $40,000 contribute anything to the cost of their education (Harvard). But former Princeton president William Bowen, now president of the Andrew W. Mellon Foundation, doubts that these actions will have much effect on the student mix. Bowen argues that only an “affirmative action” approach, i.e., preferential admissions for economically disadvantaged students, will make a difference. (Bowen discusses the issue in the second of his 2004 Jefferson lectures, delivered on April 7 at the University of Virginia, “The Quest for Equity: ‘Class’ (Socio-Economic Status) in American Higher Education,” available on the Mellon Foundation Web site, www.mellon.org.) There is reason to believe that Bowen is right. At Harvard, for instance (as reported in the Harvard Crimson on November 23, 2004), fewer than 15 percent of students come from families earning under $60,000 per year. 
  7.
    Ehrenberg’s Tuition Rising is a much less polemical book than Vedder’s Going Broke by Degree. Also helpful is Michael S. McPherson and Morton O. Schapiro, The Student Aid Game: Meeting Need and Rewarding Talent in American Higher Education (Princeton University Press, 1998). 
  8.
    Donald H. Meyer, The Instructed Conscience: The Shaping of the American National Ethic (University of Pennsylvania Press, 1972), p. 68. 
  9.
    James Fallows, “The Early-Decision Racket,” The Atlantic Monthly, September 2001, p. 37. 
  10.
    Liberal arts colleges are customarily defined as moderately selective, with most if not all students living on campus, and at least 40 percent majoring in some liberal subject such as English, foreign languages, biological sciences or physics, mathematics, philosophy, religion, psychology, social science, visual or performing arts, or ethnic studies. By these criteria, the number of such institutions is roughly 230, and virtually all face serious financial challenges in maintaining laboratories and libraries, competitive faculty salaries, and financial aid programs for needy students. See McPherson and Schapiro, “Economic Challenges for Liberal Arts Colleges,” in Distinctively American: The Residential Liberal Arts College, edited by Steven Koblik and Stephen R. Graubard (Transaction, 2000), pp. 47–75. 
  11.
    Arthur Levine, “The Soul of a New University,” The New York Times, March 13, 2000. In fact, it was already happening: between 2000 and 2003, Columbia University (of which Teachers College is an affiliate) invested some $40 million in a futile effort to capture what the university provost called “the high-end, high-quality distance learning marketplace.” By the time that venture was shut down in early 2003, it had generated only $700,000 in revenue. (See Kirp, Shakespeare, Einstein, and the Bottom Line, pp. 172–175.) The problem, as Thorstein Veblen had understood nearly a century earlier, is that the university’s main marketable commodity is prestige, and the student consumer gains prestige from being among the select few permitted to enter through the university’s ancient gates, not by logging in through an electronic portal along with anyone willing to pay. “The exigencies of prestige,” Veblen wrote prophetically in 1918 in The Higher Learning in America: A Memorandum on the Conduct of Universities by Business Men (reprinted by Hill and Wang in 1957), “will easily make it seem more to the point, in the eyes of a businesslike executive, to project a new extension of the plant; which will then be half-employed, on a scanty allowance, in work which lies on the outer fringe or beyond the university’s legitimate province.” 
  12.
    Important exceptions to this generalization are, of course, institutions that retain a strong religious affiliation. For a thoughtful account of how a university, in this case a law school at a Catholic university, can retain a sense of religious mission while being open to students of any, or no, faith, see Mark Sargent, “An Alternative to the Sectarian Vision: The Role of the Dean in an Inclusive Catholic Law School,” University of Toledo Law Review, Fall 2001, pp. 171–188. Sargent is dean of Villanova Law School. 
