A few words about the importance of text. But first, a story. Fifteen years ago, Ilicia Stangle applied to Penn as an early decision candidate. Stangle was a girl from distant Albuquerque, N.M., who felt qualified to attend this university. But Stangle wasn't naive: She knew that many applicants are qualified and that many have the high scores needed to get in. So Stangle wanted to do something extra to separate her application from everyone else's.
She added some text.
Stangle paid for an advertisement to run for a week in The Daily Pennsylvanian. It was one inch by two columns and read: "I applied early decision to the University of Pennsylvania, and I really want to be admitted."
She was.
Of course, the Admissions Office claimed it was going to accept Stangle anyway. But the director of admissions planning, Christoph Guttentag, did say the ad helped Penn form a better picture of Stangle.
"Like everything students do, [the ad] was in some way reflective of [her] personality," he said.
A decade and a half later, Stangle's story holds a lesson: Academia has become infested with numbers. Magazines rank every university. Universities rank every applicant's high-school GPA. And here at Penn, students rank every class and professor on a scale of zero to four.
Good old copy is disappearing, and with it the words that contextualize the numbers and reflect real personality.
You may have noticed the trend this past week, when classes began to fill in bubble sheets for the Penn Course Review. These sheets ask questions about the difficulty of courses and quality of instruction. And, as already mentioned, the answers must be numbers -- anything from zero to four.
The bubble sheets do provide room for additional written comments. But those comments are reviewed only by the University's departments. The Penn Course Review never even gets to see them; it only receives the numerical data, which it then posts on its Web site.
And that's essentially what the Course Review has become: an online repository of numbers. Which can be helpful at times. The site does allow students to organize the numbers according to which professors score highest in a specific course.
But the data's usefulness is limited. A professor who consistently receives fours probably teaches a great class; that much seems obvious. But how to compare a teacher who receives a 2.5 to one who receives a 2.8? Is one really better than the other? And if so, how?
No numerical sample, regardless of size, can communicate the differences in teaching style between professors. The data don't tell students anything about whether an instructor prefers lecturing or classroom discussion. Exams or papers. Coddling or discipline.
And that's a shame, because these are details students should know when choosing classes. So here's a digit-less proposal for this digital age: Let's return to the Course Review's old format. Let's add some text.
In its first 43 years of publication, the review printed several paragraphs about each course, describing each course's curriculum, professors and eccentricities. The information for each paragraph was culled from student surveys, and the review often quoted pithy survey answers.
"Boewe knows his stuff but tends to read sexual implications into everything," said one student about his literature professor in the 1959 guide. That was probably helpful to know, given that the course mostly dealt with Puritan works.
The survey paragraphs were eliminated when the review moved online in 2002; they were replaced in 2003 with an online feedback mechanism through which students could comment on courses and teachers. Unfortunately, the comment submission part of the site has been down for the past year.
"We're still looking for one or two more Web developers," said Dan Strigenz, the review's new sophomore editor-in-chief. "The primary goal is to get the comment feature up and working again."
Strigenz expects the comment function to launch again this fall. I look forward to that but want more. After all, once the comments start pouring in, it will be hard for students to navigate hundreds of their peers' opinions when choosing courses.
Students would have a much easier time reading a paragraph that synthesized all comments into a general consensus. Unfortunately, the Course Review currently lacks the staff to comb through comments; it has just six student volunteers.
Perhaps the Course Review could sell ads on its Web site and pay editors. That would certainly attract more students. Besides, the old paperback Review sold ads, and Stangle knows ads have value. She wrote a few sentences 15 years ago to communicate her real nature -- which is exactly what her text wound up doing.
Today, she works as a copywriter at an ad agency.
Gabriel Oppenheim is a College freshman from Scarsdale, N.Y. Opp-Ed appears on Fridays.