Report: Coping With Artificial Intelligence Will Take Some Real Work

August 28, 2023 By Matt Kelly, mkelly@virginia.edu

Generative artificial intelligence has the power to shape the world we all live in, so learning how to harness and deploy that power should be part of a University of Virginia education, a new faculty report says.

Just don’t use ChatGPT to write your papers for you.

AI and its implications have been on the minds of UVA faculty for a while, especially since OpenAI released ChatGPT, a computer program that mimics human writing, in November 2022. Last spring, a seven-member AI task force, which included an Honor Committee representative, brought together more than 300 faculty members for cross-Grounds discussions, collected survey responses from 504 students and 181 faculty members, and gathered input from other universities.

The task force, which completed its report this summer, looked into how the emergence of this new technology may affect what students learn, how they learn it and how that learning is assessed. It found that faculty want students to develop artificial intelligence literacy – but with faculty guidance, so that students use AI properly to learn core material more effectively and efficiently.

“AI can serve as a powerful and effective tutor,” the report stated.

There is, however, a “but.” While artificial intelligence can provide students with information, task force members think it could also disrupt important elements of the learning experience by reducing the opportunities for students to turn to professors for help and guidance with a subject. The report lays out both sides of the artificial intelligence coin: the opportunity to learn more and the threat of learning less.

“On the upside, high-quality, individualized tutoring has the capability to increase student learning in many subjects,” the report said. “On the downside, AI can function as an undetectable and endlessly customizable paper mill, undermining learning across Grounds.”

The task force advises faculty members to be precise in their expectations regarding artificial intelligence use in their courses.

“Students need clear guidelines for how and when generative AI can be used,” the report advised. “In the spring 2023 semester, 77% of student survey respondents reported that their instructors did not make their AI policies clear to them. In designing learning experiences, instructors must be transparent about how and why AI may, should, or may not be used in the course and in specific assignments.”

Once students achieve a better understanding of artificial intelligence, faculty may delve deeper into course content or expand course goals. Faculty members may also need to reassess what writing skills they want students to develop.

“Generative AI can short-circuit learning: students can outsource the work of research, analysis, and writing to these tools,” the report said. “Yet if we help students write with AI in deliberate, meaningful ways, these tools can become powerful writing partners for students and can augment their learning. AI can aid writers in generating new ideas, providing samples and models to follow, offering generalized feedback, helping writers get unstuck and assisting in the editing process.”

The task force report said faculty members may change their assessment methods to thwart artificial intelligence, including shifting to in-class, pen-and-paper examinations and reducing take-home work. At the same time, task force members caution faculty against using tools that promise to detect AI, because those tools generally prove unreliable.

The provost’s office, through its Q&A pages, discourages faculty from using AI for grading, to safeguard students’ work.

Artificial intelligence in work, research and academia will continue to evolve, according to Brie Gertler, vice provost for academic affairs. (Photo by Dan Addison, University Communications)

“Students’ original work is, in most cases, their intellectual property, and thus instructors may not enter a student’s original work into an AI tool that will add that work to the tool’s data set,” the answer read. “AI tools are not effective for grading most kinds of assignments, including writing assignments. However, some AI tools can help to ease the grading process, for example by organizing the work and facilitating the use of rubrics.”

The provost’s office also advises students against entering sensitive, personal or proprietary information into an AI program, because such programs can use the information they receive to further train themselves.

While the report stressed the need to adjust to AI, it said there are many obstacles to harnessing it effectively, including limited AI literacy. AI is still rapidly evolving, and some faculty are not prepared to reassess their teaching goals and methods, while others see only the downside of artificial intelligence.

The provost’s office encourages faculty members to become familiar with AI applications, because their students already are. But the office also cautions students that using AI to complete assignments may violate the University’s Honor Code.

The task force composed a series of recommendations, including creating a “Teaching in a Generative AI World” website, launching an AI-centered assignment design workshop, offering a series of AI workshops in the fall, developing AI expertise at the Center for Teaching Excellence, and offering programs for graduate students and others with instructional responsibility.

Other recommendations include recruiting faculty to launch AI-learning conversations within their departments, funding AI-related grants, developing field-specific AI learning techniques for the central website, and considering a teaching and learning conference.

“This technology is increasingly shaping research, communication and the world of work, and its role in higher education will continue to evolve,” wrote Brie Gertler, vice provost for academic affairs, in a letter to faculty members.

“The Task Force Report provides a fascinating and thoughtful picture of how this technology is already affecting teaching and learning in higher education, and notes that faculty who shared their views with the task force expressed nearly unanimous support for ensuring that students develop AI literacy,” Gertler wrote.

Media Contact

Matt Kelly

University News Associate, Office of University Communications