A scooter rider pauses to check directions while moving through the campus of the University of Colorado, Friday, July 19, 2024, in Boulder. (AP Photo/David Zalubowski)

The University of Colorado has delayed student access to a new artificial intelligence platform after announcing a $2 million partnership with OpenAI that has drawn swift backlash from students and faculty who say they were not consulted in the decision.

The three-year agreement, announced Feb. 11 by University President Todd Saliman, will provide students, staff and faculty with access to a CU-specific version of ChatGPT Edu, OpenAI’s version of the chatbot tailored for universities.

Users will be able to log in with their university-issued email address to access capabilities like data analytics, image generation and deep research. Each campus and the system office will manage its own secure instance of the tool. While faculty and staff will begin using the system this spring, student access has been pushed from March 31 to Aug. 14 following criticism from across the university community.

University officials have said the tool will be optional for students and faculty, though instructors will ultimately determine how or whether it can be used in their classrooms. Adoption will likely vary considerably by discipline. It is not yet clear how widely it will be integrated into coursework or campus systems. 

The decision, made without a public comment period, has sparked debate across the Boulder campus over how CU should adopt rapidly evolving AI tools and what role faculty and students should play in those decisions.

A group of CU Boulder researchers and instructors has organized an open letter opposing the agreement. The letter cites concerns about privacy, academic integrity, the impact of large language models on critical thinking and what they describe as a lack of transparency in the decision-making process. The letter has about 800 signatories.

The contract was recommended by a university AI Working Group composed of two representatives from each campus and the system office. The two Boulder representatives are from the Office of Contracts and Grants and the Office of Information Technology. Of the 10-person working group, three are professors — two from UCCS and one from CU Denver. The authors of the dissent letter argue that “no CU Boulder professor with sufficient expertise in AI was consulted on this agreement.”

According to CU’s website, the working group evaluated vendors based on principles including privacy, security, fairness, transparency and human-first design. 

“This initiative is intended to help ensure that every student has the opportunity to explore this technology and be prepared to engage with it in a rapidly evolving workforce,” Saliman and the university’s four campus chancellors wrote in a joint announcement. They described the effort as a matter of equity in access to educational tools. CU has also said that OpenAI will not use university data to train its models. 

University officials said they selected OpenAI in part because ChatGPT is already the most widely used AI tool among CU students and staff. The agreement, they said, is “time limited and evaluative,” allowing the university to reconsider its approach in the future.

CU has not provided detailed information on how the contract breaks down financially. The $2 million covers annual licensing costs for 100,000 users. Because the agreement spans three years, the total cost could be significantly higher. Officials have said the first year will be funded by the system office, and individual campuses will assume the cost after that. CU Boulder Provost Ann Stevens wrote, “What it ultimately costs CU Boulder will depend on how the agreement is evaluated and whether it continues beyond the initial term.”

University of Colorado President Todd Saliman at the University of Colorado Anschutz Medical Campus Saturday, April 20, 2024, in Aurora, Colo. (AP Photo/David Zalubowski)

Still, critics have pointed to OpenAI’s broader business relationships and political ties, as well as environmental concerns tied to AI systems’ energy use. Others questioned whether the $2 million cost could ultimately be borne by students. A lack of transparency has been a recurring point of frustration for members of the CU community.

In an online survey conducted by Flynn Zook, a CU Denver student, nearly 300 respondents weighed in on the agreement. Fewer than 10 expressed clear support, while a small number said they were undecided. Many respondents raised concerns about environmental impact, intellectual property and the potential use of tuition dollars to fund the initiative. 

“The university claims to be standing by its commitments to sustainability, but they’ve given no concrete examples as to how they’re going to do that,” Zook said. “There’s no way for the community to oversee or enforce that.”

In response to criticism, Stevens acknowledged the lack of broader consultation. “This contract is not the end of the conversation,” she wrote. “It is the beginning of a more structured and inclusive one.”

The dissent letter calls for a faculty-led process to establish ethical guidelines, develop AI literacy training and shape how the university communicates about the partnership.

Meanwhile, the CU Board of Regents is considering a broader policy framework to address what it describes as the “ethical, legal and operational challenges” of generative AI. The policy would set high-level guardrails, with more detailed rules to follow.

It remains unclear what opportunities, if any, students and faculty will have to formally shape the agreement before student access begins in August.

The decision to delay student access was announced March 19 in a systemwide email from Saliman, citing concerns from faculty about disruption to the learning environment and affirming instructors’ authority over classroom policies.

The decision could shape how thousands of CU Boulder students write, study and are evaluated, even as the university continues to define rules around AI use.

“I’m very excited that we got a delay,” Zook said. He hopes it will allow time to develop clearer guidelines. But he said he would still prefer that the university abandon the contract altogether.

“I’m so tired of the excuse that AI is inevitable,” Zook said. “It is 100% our choice.”

McKenzie Watson-Fore is a writer, artist and critic based in her hometown of Boulder. She is the executive editor of sneaker wave magazine and the founder and organizer of the Thunderdome Conference. You can find her loitering around Pearl Street, drinking oolong tea on her back porch, or online at MWatsonFore.com.
