U.Va. Sociologist's Study of College Learning Sparks National Attention

January 25, 2011 — College has been called the new high school – a requirement for success in our information economy. But what do students really learn in college?

A new book, "Academically Adrift: Limited Learning on College Campuses," by sociology professor Josipa Roksa of the University of Virginia's College of Arts & Sciences and Richard Arum, a professor of sociology and education at New York University, tries to answer that question and finds plenty of cause for concern.

They found that among more than 2,300 undergraduates at 24 institutions, 45 percent of students "demonstrated no significant gains in critical thinking, analytical reasoning, and written communications during the first two years of college." A follow-up study released last Tuesday in tandem with the book found that after four years, 36 percent of students did not demonstrate significant improvement. 

Data collection for the book began in 2005, when Arum was Roksa's doctoral adviser at NYU; her expertise in higher education made her a natural partner for the project.

"When we started working on this project, we did not know what to expect," said Roksa, who has been an assistant professor of sociology at U.Va. since completing her Ph.D. in 2006. "We started the project being primarily interested in examining racial/ethnic and socioeconomic gaps in learning. There are many studies on this issue in K-12, but not in higher ed.

"When we collected the data and started examining it, we realized how little learning was going on and how little effort students were investing in academic pursuits. My passion is higher education, and I believe in using empirical evidence to identify its strengths and weaknesses, and use that knowledge for improvement," added Roksa, who has a courtesy appointment with the Curry School of Education and affiliation with Curry's Center for Advanced Study of Teaching and Learning.

Media interest in the book has been strong, with coverage in the Chronicle of Higher Education, the New York Times, the Associated Press and other major outlets.

"It is an honor to be a part of a national conversation about the future of higher education, and potentially, a part of the transformative force for change," Roksa said. "There are always questions about methods and measurement, but the overall message, the overall sense that something is amiss in higher education, seems to resonate within and outside of academia."

However, in the book, Roksa and Arum warn against overreacting to the findings.

The study is one of the first to follow a cohort of undergraduates over their four-year college careers and test their learning of specific skills, Roksa said.

In K-12 education, large, nationally representative datasets have tracked tens of thousands of students for decades, allowing researchers to carefully study learning, she said. By contrast, in higher education, there are no such nationally representative datasets that include measures of learning.

"Our study focuses on a small number of schools and students, and uses a specific set of measures. We need to learn a lot more about predictors of learning," Roksa said. "Assessment of learning, particularly pertaining to general collegiate skills such as critical thinking, analytical reasoning and writing, is in the early stages of development."

Rather than government-imposed accountability measures, Roksa said, colleges themselves ought to decide how best to promote academic rigor.

Changing student cultures and incentives will likely be part of the solution, but the current state of learning in higher education is a responsibility shared by students, faculty, administrators, policymakers and parents, Roksa said. "All of those stakeholders bear responsibility for change."

Key Findings and How Students Were Assessed

For Arum and Roksa's study, student learning was measured with the Collegiate Learning Assessment, a voluntary, 90-minute, essay-type test that includes real world problem-solving tasks – such as determining the cause of an airplane crash – that require reading and analyzing documents ranging from newspaper articles to government reports.

The test has been used since 2002 by more than 400 colleges and universities.

(U.Va. was not part of the new study because U.Va. students do not take the CLA, said Lois Myers, university assessment coordinator and associate director of U.Va.'s Office of Institutional Assessment and Studies. U.Va. does measure student competency and learning with state-mandated assessments, and University first- and fourth-year students have taken the National Survey of Student Engagement four times since 2000; those results are publicly available.)

The 2,322 undergraduates tracked in the study are ethnically, socioeconomically and geographically representative of the nation's student population, Roksa and Arum note.

Similarly, the 24 study schools (which participated on the condition that they not be named) are geographically and institutionally representative of a broad range of American higher education, the authors said.

The tracked group of students took the assessment three times in their college careers – in the fall of 2005, the spring of 2007, and the spring of 2009. Results from the first two tests appear in the book, while the 2009 test results were summarized in an accompanying study presented Jan. 18 at the Washington offices of the Social Science Research Council, which co-sponsored the research.

While 45 percent of students "demonstrated no significant gains in critical thinking, analytical reasoning, and written communications during the first two years of college," even those students who did show improvements tended to show only modest gains, the study found. Those who entered college in the 50th percentile of their cohort would rise to the equivalent of the 57th percentile after two years, and the 68th percentile after four years.
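To see just how modest those percentile movements are, they can be restated as standardized effect sizes. The short Python sketch below assumes, purely for illustration, that assessment scores are normally distributed – an assumption the article itself does not make – and uses only the standard library:

```python
# Convert the study's reported percentile movements into standard-deviation
# effect sizes, assuming (for illustration only) normally distributed scores.
from statistics import NormalDist

nd = NormalDist()  # standard normal: mean 0, standard deviation 1

# A student entering at the 50th percentile of the cohort rises to the
# 57th percentile after two years and the 68th percentile after four years.
gain_two_years = nd.inv_cdf(0.57) - nd.inv_cdf(0.50)   # ~0.18 SD
gain_four_years = nd.inv_cdf(0.68) - nd.inv_cdf(0.50)  # ~0.47 SD

print(f"Two-year gain:  {gain_two_years:.2f} SD")
print(f"Four-year gain: {gain_four_years:.2f} SD")
```

Under that normality assumption, two years of college correspond to a gain of roughly a fifth of a standard deviation, and four years to just under half a standard deviation, for the median student.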

The results are consistent with students' own accounts of their college experiences, the authors found:

- Fifty percent of students did not take a course requiring 20 pages of writing during the prior semester.
- One-third of students did not take a course requiring 40 pages of reading a week.
- On average, students reported spending only 12 hours per week studying, and they met with their professors outside of the classroom rarely, if ever. By contrast, students spent 85 hours a week – roughly half of their time – socializing or in extracurricular activities.

After controlling for students' family income, educational backgrounds and other factors, higher assessment scores correlated with three variables: spending more time studying, having professors who hold high expectations, and having taken, in the previous semester, at least one writing-intensive course (more than 20 pages of writing) and at least one reading-intensive course (more than 40 pages per week).

All of those findings point toward the importance of academic rigor and expectations, Roksa said.

— By Brevy Cannon

Media Contact

H. Brevy Cannon

Media Relations Associate, Office of University Communications